| Column | Type | Min length | Max length |
|---|---|---|---|
| sha | string | 40 | 40 |
| text | string | 1 | 13.4M |
| id | string | 2 | 117 |
| tags | sequence | 1 | 7.91k |
| created_at | string | 25 | 25 |
| metadata | string | 2 | 875k |
| last_modified | string | 25 | 25 |
| arxiv | sequence | 0 | 25 |
| languages | sequence | 0 | 7.91k |
| tags_str | string | 17 | 159k |
| text_str | string | 1 | 447k |
| text_lists | sequence | 0 | 352 |
| processed_texts | sequence | 1 | 353 |
| tokens_length | sequence | 1 | 353 |
| input_texts | sequence | 1 | 40 |
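The column statistics above describe the dataset's schema; the same information can be inspected programmatically once the dataset is loaded. A minimal sketch, using a placeholder repository id since this excerpt does not name the dataset itself:

```python
from datasets import load_dataset

# Placeholder repository id; substitute the dataset this viewer excerpt belongs to.
ds = load_dataset("user/dataset-cards-dump", split="train")

print(ds.features)                       # column names and types (string vs. sequence)
print(ds[0]["id"], len(ds[0]["text"]))   # inspect a single row
```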
**Row 1**

- **sha:** 2045296c3b6e71ca41d4852cd2a342ea9d126703
- **text:** Dataset for the PayMoneyWubby model, available here: https://huggingface.co/ScottishHaze/PayMoneyWubby
- **id:** ScottishHaze/PayMoneyWubby
- **tags:** ["license:cc-by-nc-sa-4.0", "region:us"]
- **created_at:** 2023-12-29T17:38:29+00:00
- **metadata:** {"license": "cc-by-nc-sa-4.0"}
- **last_modified:** 2023-12-29T17:45:57+00:00
- **arxiv:** []
- **languages:** []
- **tags_str:** TAGS\n#license-cc-by-nc-sa-4.0 #region-us
- **text_str:** Dataset for the PayMoneyWubby model, available here: URL
- **text_lists:** []
- **processed_texts:** ["TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n"]
- **tokens_length:** [19]
- **input_texts:** ["passage: TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n"]
**Row 2**

- **sha:** 50f9461d73aa8ce5cb33eadb417032cf1ecc9362
- **text:**
# Dataset Card for Evaluation run of OpenPipe/mistral-ft-optimized-1227
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OpenPipe/mistral-ft-optimized-1227](https://huggingface.co/OpenPipe/mistral-ft-optimized-1227) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1227",
"harness_winogrande_5",
split="train")
```
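To see what else is available, you can also list the per-task configurations and load the aggregated scores. A minimal sketch, assuming the "results" configuration and the "latest" split exist as described above:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1227"

# One configuration per evaluated task (e.g. "harness_winogrande_5"), plus "results".
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# The "latest" split of the "results" configuration holds the most recent aggregated scores.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```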
## Latest results
These are the [latest results from run 2023-12-30T02:09:55.411463](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1227/blob/main/results_2023-12-30T02-09-55.411463.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.655214805560653,
"acc_stderr": 0.03196791889189701,
"acc_norm": 0.6555902741913247,
"acc_norm_stderr": 0.032620412320903916,
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405338,
"mc2": 0.5451472575391979,
"mc2_stderr": 0.015603038482785155
},
"harness|arc:challenge|25": {
"acc": 0.6390784982935154,
"acc_stderr": 0.014034761386175452,
"acc_norm": 0.6723549488054608,
"acc_norm_stderr": 0.013715847940719337
},
"harness|hellaswag|10": {
"acc": 0.6786496713802032,
"acc_stderr": 0.004660405565338758,
"acc_norm": 0.8589922326229835,
"acc_norm_stderr": 0.0034731828909689696
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544064,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544064
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924003,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.02315787934908353,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.02315787934908353
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217483,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217483
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.02950286112895529,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.02950286112895529
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634286,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634286
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.034063153607115086,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.034063153607115086
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934725,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934725
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266196,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266196
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323792,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323792
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3843575418994413,
"acc_stderr": 0.016269088663959406,
"acc_norm": 0.3843575418994413,
"acc_norm_stderr": 0.016269088663959406
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695814,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47979139504563234,
"acc_stderr": 0.012759801427767566,
"acc_norm": 0.47979139504563234,
"acc_norm_stderr": 0.012759801427767566
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.027576468622740533,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.027576468622740533
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.02448448716291397,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.02448448716291397
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405338,
"mc2": 0.5451472575391979,
"mc2_stderr": 0.015603038482785155
},
"harness|winogrande|5": {
"acc": 0.7884767166535123,
"acc_stderr": 0.011477747684223188
},
"harness|gsm8k|5": {
"acc": 0.7134192570128886,
"acc_stderr": 0.012454841668337697
}
}
```
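Because the per-task entries above follow a regular naming scheme, aggregate scores can be recomputed directly from this JSON. A minimal sketch, assuming the block above has been saved locally as `results.json`; note this is a plain unweighted mean over the `hendrycksTest-*` subtasks, which may differ slightly from the leaderboard's own aggregation:

```python
import json

# Load the results block shown above (saved locally as results.json).
with open("results.json") as f:
    results = json.load(f)

# Unweighted mean accuracy over all MMLU ("hendrycksTest-*") subtasks.
mmlu = {k: v for k, v in results.items() if "hendrycksTest-" in k}
mean_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc = {mean_acc:.4f}")
```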
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1227 | [
"region:us"
] | 2023-12-29T17:41:16+00:00 | {"pretty_name": "Evaluation run of OpenPipe/mistral-ft-optimized-1227", "dataset_summary": "Dataset automatically created during the evaluation run of model [OpenPipe/mistral-ft-optimized-1227](https://huggingface.co/OpenPipe/mistral-ft-optimized-1227) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1227\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T02:09:55.411463](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenPipe__mistral-ft-optimized-1227/blob/main/results_2023-12-30T02-09-55.411463.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.655214805560653,\n \"acc_stderr\": 0.03196791889189701,\n \"acc_norm\": 0.6555902741913247,\n \"acc_norm_stderr\": 0.032620412320903916,\n \"mc1\": 0.37454100367197063,\n \"mc1_stderr\": 0.016943535128405338,\n \"mc2\": 0.5451472575391979,\n \"mc2_stderr\": 0.015603038482785155\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6390784982935154,\n \"acc_stderr\": 0.014034761386175452,\n \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719337\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6786496713802032,\n \"acc_stderr\": 0.004660405565338758,\n \"acc_norm\": 0.8589922326229835,\n \"acc_norm_stderr\": 0.0034731828909689696\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924003,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924003\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.02315787934908353,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.02315787934908353\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217483,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217483\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37407407407407406,\n \"acc_stderr\": 0.02950286112895529,\n \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.02950286112895529\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634286,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634286\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.034063153607115086,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.034063153607115086\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934725,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934725\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8352490421455939,\n \"acc_stderr\": 0.013265346261323792,\n \"acc_norm\": 0.8352490421455939,\n \"acc_norm_stderr\": 0.013265346261323792\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3843575418994413,\n \"acc_stderr\": 0.016269088663959406,\n \"acc_norm\": 0.3843575418994413,\n \"acc_norm_stderr\": 0.016269088663959406\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47979139504563234,\n \"acc_stderr\": 0.012759801427767566,\n \"acc_norm\": 0.47979139504563234,\n \"acc_norm_stderr\": 0.012759801427767566\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.027576468622740533,\n \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.027576468622740533\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.02448448716291397,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.02448448716291397\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37454100367197063,\n \"mc1_stderr\": 0.016943535128405338,\n \"mc2\": 0.5451472575391979,\n \"mc2_stderr\": 0.015603038482785155\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7884767166535123,\n \"acc_stderr\": 0.011477747684223188\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7134192570128886,\n \"acc_stderr\": 
0.012454841668337697\n }\n}\n```", "repo_url": "https://huggingface.co/OpenPipe/mistral-ft-optimized-1227", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|arc:challenge|25_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|gsm8k|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hellaswag|10_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T17-38-56.111573.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T17-38-56.111573.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-09-55.411463.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-09-55.411463.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-09-55.411463.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-09-55.411463.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-09-55.411463.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": 
"2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T17-38-56.111573.parquet"]}, 
{"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["**/details_harness|winogrande|5_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": ["**/details_harness|winogrande|5_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T02-09-55.411463.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T17_38_56.111573", "path": ["results_2023-12-29T17-38-56.111573.parquet"]}, {"split": "2023_12_30T02_09_55.411463", "path": 
["results_2023-12-30T02-09-55.411463.parquet"]}, {"split": "latest", "path": ["results_2023-12-30T02-09-55.411463.parquet"]}]}]} | 2023-12-30T02:12:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of OpenPipe/mistral-ft-optimized-1227
Dataset automatically created during the evaluation run of model OpenPipe/mistral-ft-optimized-1227 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-30T02:09:55.411463 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of OpenPipe/mistral-ft-optimized-1227\n\n\n\nDataset automatically created during the evaluation run of model OpenPipe/mistral-ft-optimized-1227 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:09:55.411463(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of OpenPipe/mistral-ft-optimized-1227\n\n\n\nDataset automatically created during the evaluation run of model OpenPipe/mistral-ft-optimized-1227 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:09:55.411463(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of OpenPipe/mistral-ft-optimized-1227\n\n\n\nDataset automatically created during the evaluation run of model OpenPipe/mistral-ft-optimized-1227 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T02:09:55.411463(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
9f3ed49c439ebeb3ac77833bdd4b196364dc88ec |
# Dataset Card for Evaluation run of mlabonne/NeuralPipe-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/NeuralPipe-7B-slerp](https://huggingface.co/mlabonne/NeuralPipe-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Pick one configuration (here the 5-shot Winogrande details); the "train"
# split always points to the most recent evaluation run.
data = load_dataset("open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-slerp",
	"harness_winogrande_5",
	split="train")
```
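For quick exploration, here is a minimal sketch (not part of the auto-generated card) showing how you might list the available configurations and pin the aggregated results of the most recent run. The `"results"` configuration and `"latest"` split names are taken from this dataset's metadata; `get_dataset_config_names` is assumed to be available in your installed `datasets` version:

```python
from datasets import get_dataset_config_names, load_dataset

# Enumerate the per-task configurations exposed by this details dataset.
configs = get_dataset_config_names("open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-slerp")
print(len(configs), configs[:5])

# Load the aggregated metrics of the most recent run via the "latest" split.
latest_results = load_dataset("open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-slerp",
	"results",
	split="latest")
```

Because each run is also stored as a timestamp-named split, you can pin an older run by passing that split name instead of `latest`.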
## Latest results
These are the [latest results from run 2024-01-05T12:33:01.505276](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-slerp/blob/main/results_2024-01-05T12-33-01.505276.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6445269708058093,
"acc_stderr": 0.03218714474134609,
"acc_norm": 0.6449418405596148,
"acc_norm_stderr": 0.03284511879516387,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.598408044881861,
"mc2_stderr": 0.015149948573522944
},
"harness|arc:challenge|25": {
"acc": 0.6476109215017065,
"acc_stderr": 0.013960142600598675,
"acc_norm": 0.6757679180887372,
"acc_norm_stderr": 0.013678810399518829
},
"harness|hellaswag|10": {
"acc": 0.6701852220673172,
"acc_stderr": 0.0046918486653990685,
"acc_norm": 0.8616809400517825,
"acc_norm_stderr": 0.003445289925011734
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.01509421569970048,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.01509421569970048
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.0257449025322909,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.0257449025322909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323793,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323793
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36312849162011174,
"acc_stderr": 0.016083749986853697,
"acc_norm": 0.36312849162011174,
"acc_norm_stderr": 0.016083749986853697
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015058,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.598408044881861,
"mc2_stderr": 0.015149948573522944
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.01120186274448705
},
"harness|gsm8k|5": {
"acc": 0.6823351023502654,
"acc_stderr": 0.012824066621488845
}
}
```
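If you just want to recompute a headline number from the JSON dump above without re-downloading the parquet files, a small sketch follows. `results_json` is a hypothetical variable assumed to hold the block above as a string; it is not provided by this dataset:

```python
import json

# `results_json` is assumed to contain the JSON object shown above as a string.
results = json.loads(results_json)

# Average accuracy over the MMLU (hendrycksTest) subtasks reported in this run.
mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```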
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-slerp | [
"region:us"
] | 2023-12-29T17:47:15+00:00 | {"pretty_name": "Evaluation run of mlabonne/NeuralPipe-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [mlabonne/NeuralPipe-7B-slerp](https://huggingface.co/mlabonne/NeuralPipe-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T12:33:01.505276](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-slerp/blob/main/results_2024-01-05T12-33-01.505276.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6445269708058093,\n \"acc_stderr\": 0.03218714474134609,\n \"acc_norm\": 0.6449418405596148,\n \"acc_norm_stderr\": 0.03284511879516387,\n \"mc1\": 0.4283965728274174,\n \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.598408044881861,\n \"mc2_stderr\": 0.015149948573522944\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6476109215017065,\n \"acc_stderr\": 0.013960142600598675,\n \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.013678810399518829\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6701852220673172,\n \"acc_stderr\": 0.0046918486653990685,\n \"acc_norm\": 0.8616809400517825,\n \"acc_norm_stderr\": 0.003445289925011734\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n 
\"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 
0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8550458715596331,\n \"acc_stderr\": 0.01509421569970048,\n \"acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.01509421569970048\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.0257449025322909,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.0257449025322909\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n \"acc_stderr\": 0.013265346261323793,\n \"acc_norm\": 0.8352490421455939,\n 
\"acc_norm_stderr\": 0.013265346261323793\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n \"acc_stderr\": 0.016083749986853697,\n \"acc_norm\": 0.36312849162011174,\n \"acc_norm_stderr\": 0.016083749986853697\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n \"acc_stderr\": 0.012751075788015058,\n \"acc_norm\": 0.4726205997392438,\n \"acc_norm_stderr\": 0.012751075788015058\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.598408044881861,\n \"mc2_stderr\": 0.015149948573522944\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.01120186274448705\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6823351023502654,\n \"acc_stderr\": 0.012824066621488845\n }\n}\n```", "repo_url": "https://huggingface.co/mlabonne/NeuralPipe-7B-slerp", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|arc:challenge|25_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|arc:challenge|25_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|gsm8k|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|gsm8k|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hellaswag|10_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hellaswag|10_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T17-44-55.770154.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T17-44-55.770154.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-33-01.505276.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-33-01.505276.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-33-01.505276.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T12-33-01.505276.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T17-44-55.770154.parquet"]}, 
{"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["**/details_harness|winogrande|5_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": ["**/details_harness|winogrande|5_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T12-33-01.505276.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T17_44_55.770154", "path": ["results_2023-12-29T17-44-55.770154.parquet"]}, {"split": "2024_01_05T12_33_01.505276", "path": 
["results_2024-01-05T12-33-01.505276.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T12-33-01.505276.parquet"]}]}]} | 2024-01-05T12:35:40+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mlabonne/NeuralPipe-7B-slerp
Dataset automatically created during the evaluation run of model mlabonne/NeuralPipe-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
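A minimal sketch of that loading call, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming (here `open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-slerp`); any config listed in this card can be passed as the second argument:

```python
from datasets import load_dataset

# Repo name assumed from the leaderboard's standard naming pattern for details datasets.
data = load_dataset(
    "open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-slerp",
    "harness_winogrande_5",   # any config name from this card works here
    split="train",            # "train" always points to the latest results
)
```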
## Latest results
These are the latest results from run 2024-01-05T12:33:01.505276 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of mlabonne/NeuralPipe-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model mlabonne/NeuralPipe-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T12:33:01.505276(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of mlabonne/NeuralPipe-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model mlabonne/NeuralPipe-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T12:33:01.505276(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mlabonne/NeuralPipe-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model mlabonne/NeuralPipe-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T12:33:01.505276(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
6a955c7be0bbbabcbf455a415dad7ca9514bfd32 | Dataset descriptions:
- lichess_6gb: 6GB of 16 million games from lichess's database. No elo filtering performed. Comprised of games from lichess 2016-06 and 2017-05.
- lichess_9gb: 9GB of games from lichess's database. No elo filtering performed. Comprised of games from lichess 2017-07 and 2017-08.
- Lichess_gt_18k: ~4GB of games from lichess. Per OpenAI's weak to strong generalization paper, filtered to only include games where white is > 1800 ELO.
- Stockfish: 4.5GB of games generated by White playing as Stockfish ELO 3200 against a range of Stockfish ELO 1300-3200 as black.
- Lichess-stockfish mix: a 50 / 50 mix of > 1800 ELO lichess games and stockfish generated games
- Lichess results: lichess, but we include the result before every game. Hopefully, we can then prompt the model with ";1-0#1.", indicating to the model that it's supposed to win this game.
- lichess_200k_elo_bins: We include a maximum of 200k games from every 100 Elo bucket, so the model trains on a more even distribution of Elos.
The "blocks" datasets include only one column and are used for training. Every cell is a batch I created that is 1024 characters long. Datasets without "blocks" in the name
contain metadata like player skill, result, etc.
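For illustration, a rough sketch of how fixed 1024-character blocks could be cut from a file of concatenated PGN transcripts; the file and column names here are placeholders, and the notebook linked below is the actual implementation:

```python
import pandas as pd

BLOCK_SIZE = 1024

def make_blocks(text: str, block_size: int = BLOCK_SIZE) -> list[str]:
    # Keep only full-length chunks so every cell is exactly block_size characters.
    return [
        text[i : i + block_size]
        for i in range(0, len(text) - block_size + 1, block_size)
    ]

# "games.txt" is a placeholder for a file of concatenated PGN game strings.
with open("games.txt") as f:
    blocks = make_blocks(f.read())

# Single-column output, matching the description of the "blocks" datasets;
# the column name "transcript" is a guess, not taken from the dataset.
pd.DataFrame({"transcript": blocks}).to_csv("blocks.csv", index=False)
```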
This script is used to create the batches of 1024 characters from a file with a bunch of PGNs: https://github.com/adamkarvonen/chess_gpt_eval/blob/dataset_generation/logs/batching.ipynb | adamkarvonen/chess_games | [
"region:us"
] | 2023-12-29T17:55:22+00:00 | {} | 2024-01-23T20:24:12+00:00 | [] | [] | TAGS
#region-us
| Dataset descriptions:
- lichess_6gb: 6GB of 16 million games from lichess's database. No elo filtering performed. Comprised of games from lichess 2016-06 and 2017-05.
- lichess_9gb: 9GB of games from lichess's database. No elo filtering performed. Comprised of games from lichess 2017-07 and 2017-08.
- Lichess_gt_18k: ~4GB of games from lichess. Per OpenAI's weak to strong generalization paper, filtered to only include games where white is > 1800 ELO.
- Stockfish: 4.5GB of games generated by White playing as Stockfish ELO 3200 against a range of Stockfish ELO 1300-3200 as black.
- Lichess-stockfish mix: a 50 / 50 mix of > 1800 ELO lichess games and stockfish generated games
- Lichess results: lichess, but we include the result before every game. Hopefully, we can then prompt the model with ";1-0#1.", indicating to the model that it's supposed to win this game.
- lichess_200k_elo_bins: We include a maximum of 200k games from every 100 Elo bucket, so the model trains on a more even distribution of Elos.
The "blocks" datasets include only one column and are used for training. Every cell is a batch I created that is 1024 characters long. Datasets without "blocks" in the name
contain metadata like player skill, result, etc.
This script is used to create the batches of 1024 characters from a file with a bunch of PGNs: URL | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
0069f23a233e7bbf460b28a075db9700fe66d5a3 |
# Dataset Card for Evaluation run of mlabonne/NeuralPipe-7B-ties
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/NeuralPipe-7B-ties](https://huggingface.co/mlabonne/NeuralPipe-7B-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-ties",
"harness_winogrande_5",
split="train")
```
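The aggregated metrics shown in the next section live in the "results" configuration; a minimal sketch of loading them (the "latest" split always points at the most recent run):
```python
from datasets import load_dataset

# Aggregated results of the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-ties",
                       "results",
                       split="latest")
```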
## Latest results
These are the [latest results from run 2023-12-29T18:23:14.168913](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-ties/blob/main/results_2023-12-29T18-23-14.168913.json)(note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6463521388608168,
"acc_stderr": 0.03211851750269888,
"acc_norm": 0.6467147686147078,
"acc_norm_stderr": 0.032775359620950996,
"mc1": 0.44430844553243576,
"mc1_stderr": 0.01739458625074317,
"mc2": 0.6137333376102074,
"mc2_stderr": 0.015322517797295732
},
"harness|arc:challenge|25": {
"acc": 0.6407849829351536,
"acc_stderr": 0.014020224155839157,
"acc_norm": 0.6791808873720137,
"acc_norm_stderr": 0.013640943091946533
},
"harness|hellaswag|10": {
"acc": 0.6729735112527385,
"acc_stderr": 0.004681682605347881,
"acc_norm": 0.8603863772156941,
"acc_norm_stderr": 0.0034587739347195527
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.02424378399406216,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.02424378399406216
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02675640153807897,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02675640153807897
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579665,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579665
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608304,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608304
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546835,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37318435754189944,
"acc_stderr": 0.016175692013381964,
"acc_norm": 0.37318435754189944,
"acc_norm_stderr": 0.016175692013381964
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897229,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897229
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44430844553243576,
"mc1_stderr": 0.01739458625074317,
"mc2": 0.6137333376102074,
"mc2_stderr": 0.015322517797295732
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.011201862744487048
},
"harness|gsm8k|5": {
"acc": 0.6952236542835482,
"acc_stderr": 0.012679297549515427
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-ties | [
"region:us"
] | 2023-12-29T18:25:28+00:00 | {"pretty_name": "Evaluation run of mlabonne/NeuralPipe-7B-ties", "dataset_summary": "Dataset automatically created during the evaluation run of model [mlabonne/NeuralPipe-7B-ties](https://huggingface.co/mlabonne/NeuralPipe-7B-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-ties\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T18:23:14.168913](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-ties/blob/main/results_2023-12-29T18-23-14.168913.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6463521388608168,\n \"acc_stderr\": 0.03211851750269888,\n \"acc_norm\": 0.6467147686147078,\n \"acc_norm_stderr\": 0.032775359620950996,\n \"mc1\": 0.44430844553243576,\n \"mc1_stderr\": 0.01739458625074317,\n \"mc2\": 0.6137333376102074,\n \"mc2_stderr\": 0.015322517797295732\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6407849829351536,\n \"acc_stderr\": 0.014020224155839157,\n \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946533\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6729735112527385,\n \"acc_stderr\": 0.004681682605347881,\n \"acc_norm\": 0.8603863772156941,\n \"acc_norm_stderr\": 0.0034587739347195527\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.02424378399406216,\n \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.02424378399406216\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02675640153807897,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02675640153807897\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579665,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579665\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8301404853128991,\n \"acc_stderr\": 0.013428186370608304,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608304\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546835,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546835\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37318435754189944,\n \"acc_stderr\": 0.016175692013381964,\n \"acc_norm\": 0.37318435754189944,\n \"acc_norm_stderr\": 0.016175692013381964\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.012741974333897229,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.012741974333897229\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44430844553243576,\n \"mc1_stderr\": 0.01739458625074317,\n \"mc2\": 0.6137333376102074,\n \"mc2_stderr\": 0.015322517797295732\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.011201862744487048\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \"acc_stderr\": 0.012679297549515427\n }\n}\n```", "repo_url": 
"https://huggingface.co/mlabonne/NeuralPipe-7B-ties", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|arc:challenge|25_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|gsm8k|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hellaswag|10_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T18-23-14.168913.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T18-23-14.168913.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T18-23-14.168913.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T18-23-14.168913.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T18-23-14.168913.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T18-23-14.168913.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["**/details_harness|winogrande|5_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T18-23-14.168913.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T18_23_14.168913", "path": ["results_2023-12-29T18-23-14.168913.parquet"]}, {"split": "latest", "path": 
["results_2023-12-29T18-23-14.168913.parquet"]}]}]} | 2023-12-29T18:25:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mlabonne/NeuralPipe-7B-ties
Dataset automatically created during the evaluation run of model mlabonne/NeuralPipe-7B-ties on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
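A minimal sketch of that call (the repository id below is an assumption, following the `details_<org>__<model>` naming pattern used by the other evaluation-run datasets in this collection):

```python
from datasets import load_dataset

# Assumed repository id for this run's details, following the
# open-llm-leaderboard "details_<org>__<model>" convention.
data = load_dataset(
    "open-llm-leaderboard/details_mlabonne__NeuralPipe-7B-ties",
    "harness_winogrande_5",  # one of the 63 task configurations
    split="train",           # points to the latest results
)
```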
## Latest results
These are the latest results from run 2023-12-29T18:23:14.168913 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of mlabonne/NeuralPipe-7B-ties\n\n\n\nDataset automatically created during the evaluation run of model mlabonne/NeuralPipe-7B-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T18:23:14.168913(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of mlabonne/NeuralPipe-7B-ties\n\n\n\nDataset automatically created during the evaluation run of model mlabonne/NeuralPipe-7B-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T18:23:14.168913(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mlabonne/NeuralPipe-7B-ties\n\n\n\nDataset automatically created during the evaluation run of model mlabonne/NeuralPipe-7B-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T18:23:14.168913(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
85144beff6ee0cae283a533ad97a7214855bb0c8 | Persuasion for Good: Towards a Personalized Persuasive Dialogue System for Social Good
Dataset and Codebase for Persuasion for Good: Towards a Personalized Persuasive Dialogue System for Social Good
published as a long paper in ACL 2019.
https://arxiv.org/abs/1906.06725
If you use the datasets or any source code included in this repository in your work, please cite the following paper. The BibTeX entry is listed below:
@article{wang2019persuasion,
title={Persuasion for Good: Towards a Personalized Persuasive Dialogue System for Social Good},
author={Wang, Xuewei and Shi, Weiyan and Kim, Richard and Oh, Yoojung and Yang, Sijia and Zhang, Jingwen and Yu, Zhou},
journal={arXiv preprint arXiv:1906.06725},
year={2019}
}
B2: Dialogue ID
B4: Role (0 means persuader, 1 means persuadee)
Turn: Turn index
Unit: Sentence in utterance
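Put together, each row of the `FullDialog` split carries these four fields, so a single dialogue can be reassembled by filtering on B2 and sorting on Turn. A minimal sketch (the repository id and split name are taken from this repo's dataset configuration):

```python
from datasets import load_dataset

# Columns: B2 = dialogue ID, B4 = role (0 persuader, 1 persuadee),
# Turn = turn index, Unit = sentence in the utterance.
ds = load_dataset("spawn99/PersuasionForGood", split="FullDialog")

dialogue_id = ds[0]["B2"]
turns = sorted(
    (row for row in ds if row["B2"] == dialogue_id),
    key=lambda row: row["Turn"],
)
for row in turns:
    speaker = "Persuader" if row["B4"] == 0 else "Persuadee"
    print(f"{row['Turn']:>3} {speaker}: {row['Unit']}")
```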
| spawn99/PersuasionForGood | [
"license:mit",
"arxiv:1906.06725",
"region:us"
] | 2023-12-29T18:37:40+00:00 | {"license": "mit", "dataset_info": {"features": [{"name": "Unnamed: 0", "dtype": "int64"}, {"name": "Unit", "dtype": "string"}, {"name": "Turn", "dtype": "int64"}, {"name": "B4", "dtype": "int64"}, {"name": "B2", "dtype": "string"}], "splits": [{"name": "FullDialog", "num_bytes": 3043959, "num_examples": 20932}], "download_size": 1186349, "dataset_size": 3043959}, "configs": [{"config_name": "default", "data_files": [{"split": "FullDialog", "path": "data/FullDialog-*"}]}]} | 2023-12-29T20:09:42+00:00 | [
"1906.06725"
] | [] | TAGS
#license-mit #arxiv-1906.06725 #region-us
| Persuasion for Good: Towards a Personalized Persuasive Dialogue System for Social Good
Dataset and Codebase for Persuasion for Good: Towards a Personalized Persuasive Dialogue System for Social Good
published as a long paper in ACL 2019.
URL
If you use the datasets or any source codes included in this repository in your
work, please cite the following paper. The bibtex is listed below:
@article{wang2019persuasion,
title={Persuasion for Good: Towards a Personalized Persuasive Dialogue System for Social Good},
author={Wang, Xuewei and Shi, Weiyan and Kim, Richard and Oh, Yoojung and Yang, Sijia and Zhang, Jingwen and Yu, Zhou},
journal={arXiv preprint arXiv:1906.06725},
year={2019}
}
B2: Dialogue ID
B4: Role (0 means persuader, 1 means persuadee)
Turn: Turn index
Unit: Sentence in utterance
| [] | [
"TAGS\n#license-mit #arxiv-1906.06725 #region-us \n"
] | [
19
] | [
"passage: TAGS\n#license-mit #arxiv-1906.06725 #region-us \n"
] |
2bba5d7a7c57f8fdf2c65bd6c068d68e71a10af6 |
Cornell Movie-Dialogs Corpus
Distributed together with:
"Chameleons in imagined conversations: A new approach to understanding coordination of linguistic style in dialogs"
Cristian Danescu-Niculescu-Mizil and Lillian Lee
Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics, ACL 2011.
(this paper is included in this zip file)
NOTE: If you have results to report on these corpora, please send email to [email protected] or [email protected] so we can add you to our list of people using this data. Thanks!
Contents of this README:
A) Brief description
B) Files description
C) Details on the collection procedure
D) Contact
A) Brief description:
This corpus contains a metadata-rich collection of fictional conversations extracted from raw movie scripts:
- 220,579 conversational exchanges between 10,292 pairs of movie characters
- involves 9,035 characters from 617 movies
- in total 304,713 utterances
- movie metadata included:
- genres
- release year
- IMDB rating
- number of IMDB votes
- IMDB rating
- character metadata included:
- gender (for 3,774 characters)
- position on movie credits (3,321 characters)
B) Files description:
In all files the field separator is " +++$+++ "
- movie_titles_metadata.txt
- contains information about each movie title
- fields:
- movieID,
- movie title,
- movie year,
- IMDB rating,
- no. IMDB votes,
 - genres in the format ['genre1','genre2',...,'genreN']
- movie_characters_metadata.txt
- contains information about each movie character
- fields:
- characterID
- character name
- movieID
- movie title
- gender ("?" for unlabeled cases)
- position in credits ("?" for unlabeled cases)
- movie_lines.txt
- contains the actual text of each utterance
- fields:
- lineID
- characterID (who uttered this phrase)
- movieID
- character name
- text of the utterance
- movie_conversations.txt
- the structure of the conversations
- fields
- characterID of the first character involved in the conversation
- characterID of the second character involved in the conversation
- movieID of the movie in which the conversation occurred
- list of the utterances that make the conversation, in chronological
order: ['lineID1','lineID2',...,'lineIDN']
has to be matched with movie_lines.txt to reconstruct the actual content
- raw_script_urls.txt
- the urls from which the raw sources were retrieved
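Since every file shares the " +++$+++ " separator, the conversations can be reconstructed by joining movie_conversations.txt against movie_lines.txt. A minimal Python sketch (file paths and the iso-8859-1 encoding are assumptions; adjust to wherever the raw files live):

```python
# Sketch: rebuild conversation texts from the raw corpus files.
SEP = " +++$+++ "
ENCODING = "iso-8859-1"  # assumption: the raw files are not UTF-8

# lineID -> text of the utterance
lines = {}
with open("movie_lines.txt", encoding=ENCODING) as f:
    for row in f:
        line_id, char_id, movie_id, char_name, text = row.rstrip("\r\n").split(SEP, 4)
        lines[line_id] = text

conversations = []
with open("movie_conversations.txt", encoding=ENCODING) as f:
    for row in f:
        char1, char2, movie_id, line_ids = row.rstrip("\r\n").split(SEP, 3)
        # line_ids looks like "['L194', 'L195', ...]"
        ids = [x.strip(" '\"") for x in line_ids.strip("[]").split(",")]
        conversations.append([lines[i] for i in ids if i in lines])
```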
C) Details on the collection procedure:
We started from raw publicly available movie scripts (sources acknowledged in
raw_script_urls.txt). In order to collect the metadata necessary for this study
and to distinguish between two script versions of the same movie, we automatically
matched each script with an entry in movie database provided by IMDB (The Internet
Movie Database; data interfaces available at http://www.imdb.com/interfaces). Some
amount of manual correction was also involved. When more than one movie with the same
title was found in IMBD, the match was made with the most popular title
(the one that received most IMDB votes)
After discarding all movies that could not be matched or that had less than 5 IMDB
votes, we were left with 617 unique titles with metadata including genre, release
year, IMDB rating, no. of IMDB votes, and cast distribution. We then identified
the pairs of characters that interact and separated their conversations automatically
using simple data processing heuristics. After discarding all pairs that exchanged
less than 5 conversational exchanges there were 10,292 left, exchanging 220,579
conversational exchanges (304,713 utterances). After automatically matching the names
of the 9,035 involved characters to the list of cast distribution, we used the
gender of each interpreting actor to infer the fictional gender of a subset of
3,321 movie characters (we raised the number of gendered characters to 3,774 through
manual annotation). Similarly, we collected the end credit position of a subset
of 3,321 characters as a proxy for their status.
D) Contact:
Please email any questions to: [email protected] (Cristian Danescu-Niculescu-Mizil) | spawn99/CornellMovieDialogCorpus | [
"size_categories:100K<n<1M",
"language:en",
"license:mit",
"movie dialog",
"cornell",
"conversation",
"dialog",
"region:us"
] | 2023-12-29T18:45:14+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["100K<n<1M"], "tags": ["movie dialog", "cornell", "conversation", "dialog"], "dataset_info": {"features": [{"name": "lineID", "dtype": "string"}, {"name": "characterID", "dtype": "string"}, {"name": "movieID", "dtype": "string"}, {"name": "characterName", "dtype": "string"}, {"name": "utterance", "dtype": "string"}], "splits": [{"name": "movie_lines", "num_bytes": 29475700, "num_examples": 304713}], "download_size": 14593268, "dataset_size": 29475700}, "configs": [{"config_name": "default", "data_files": [{"split": "movie_lines", "path": "data/movie_lines-*"}]}]} | 2023-12-29T20:12:38+00:00 | [] | [
"en"
] | TAGS
#size_categories-100K<n<1M #language-English #license-mit #movie dialog #cornell #conversation #dialog #region-us
|
Cornell Movie-Dialogs Corpus
Distributed together with:
"Chameleons in imagined conversations: A new approach to understanding coordination of linguistic style in dialogs"
Cristian Danescu-Niculescu-Mizil and Lillian Lee
Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics, ACL 2011.
(this paper is included in this zip file)
NOTE: If you have results to report on these corpora, please send email to cristian@URL or llee@URL so we can add you to our list of people using this data. Thanks!
Contents of this README:
A) Brief description
B) Files description
C) Details on the collection procedure
D) Contact
A) Brief description:
This corpus contains a metadata-rich collection of fictional conversations extracted from raw movie scripts:
- 220,579 conversational exchanges between 10,292 pairs of movie characters
- involves 9,035 characters from 617 movies
- in total 304,713 utterances
- movie metadata included:
- genres
- release year
- IMDB rating
- number of IMDB votes
- IMDB rating
- character metadata included:
- gender (for 3,774 characters)
- position on movie credits (3,321 characters)
B) Files description:
In all files the field separator is " +++$+++ "
- movie_titles_metadata.txt
- contains information about each movie title
- fields:
- movieID,
- movie title,
- movie year,
- IMDB rating,
- no. IMDB votes,
 - genres in the format ['genre1','genre2',...,'genreN']
- movie_characters_metadata.txt
- contains information about each movie character
- fields:
- characterID
- character name
- movieID
- movie title
- gender ("?" for unlabeled cases)
- position in credits ("?" for unlabeled cases)
- movie_lines.txt
- contains the actual text of each utterance
- fields:
- lineID
- characterID (who uttered this phrase)
- movieID
- character name
- text of the utterance
- movie_conversations.txt
- the structure of the conversations
- fields
- characterID of the first character involved in the conversation
- characterID of the second character involved in the conversation
- movieID of the movie in which the conversation occurred
- list of the utterances that make the conversation, in chronological
order: ['lineID1','lineID2',...,'lineIDN']
has to be matched with movie_lines.txt to reconstruct the actual content
- raw_script_urls.txt
- the urls from which the raw sources were retrieved
C) Details on the collection procedure:
We started from raw publicly available movie scripts (sources acknowledged in
raw_script_urls.txt). In order to collect the metadata necessary for this study
and to distinguish between two script versions of the same movie, we automatically
matched each script with an entry in movie database provided by IMDB (The Internet
Movie Database; data interfaces available at URL Some
amount of manual correction was also involved. When more than one movie with the same
title was found in IMDB, the match was made with the most popular title
(the one that received most IMDB votes)
After discarding all movies that could not be matched or that had less than 5 IMDB
votes, we were left with 617 unique titles with metadata including genre, release
year, IMDB rating and no. of IMDB votes and cast distribution. We then identified
the pairs of characters that interact and separated their conversations automatically
using simple data processing heuristics. After discarding all pairs that exchanged
less than 5 conversational exchanges there were 10,292 left, exchanging 220,579
conversational exchanges (304,713 utterances). After automatically matching the names
of the 9,035 involved characters to the list of cast distribution, we used the
gender of each interpreting actor to infer the fictional gender of a subset of
3,321 movie characters (we raised the number of gendered 3,774 characters through
manual annotation). Similarly, we collected the end credit position of a subset
of 3,321 characters as a proxy for their status.
D) Contact:
Please email any questions to: cristian@URL (Cristian Danescu-Niculescu-Mizil) | [] | [
"TAGS\n#size_categories-100K<n<1M #language-English #license-mit #movie dialog #cornell #conversation #dialog #region-us \n"
] | [
40
] | [
"passage: TAGS\n#size_categories-100K<n<1M #language-English #license-mit #movie dialog #cornell #conversation #dialog #region-us \n"
] |
ed4ef83698f6edc8845486ece5b6847f40c25871 |
# Dataset Card for Evaluation run of YeungNLP/firefly-zephyr-6x7b-lora
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YeungNLP/firefly-zephyr-6x7b-lora](https://huggingface.co/YeungNLP/firefly-zephyr-6x7b-lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-zephyr-6x7b-lora",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-29T18:51:32.480572](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-zephyr-6x7b-lora/blob/main/results_2023-12-29T18-51-32.480572.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5989926693451659,
"acc_stderr": 0.03334318950172643,
"acc_norm": 0.6049039699813348,
"acc_norm_stderr": 0.034036223081089764,
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897306,
"mc2": 0.4883734627836678,
"mc2_stderr": 0.015369075462539867
},
"harness|arc:challenge|25": {
"acc": 0.5776450511945392,
"acc_stderr": 0.014434138713379977,
"acc_norm": 0.6100682593856656,
"acc_norm_stderr": 0.014252959848892896
},
"harness|hellaswag|10": {
"acc": 0.6292571200955985,
"acc_stderr": 0.004820166002253079,
"acc_norm": 0.8280223063134834,
"acc_norm_stderr": 0.0037658983649388727
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137595,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137595
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.025906087021319295,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.025906087021319295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5820512820512821,
"acc_stderr": 0.025007329882461217,
"acc_norm": 0.5820512820512821,
"acc_norm_stderr": 0.025007329882461217
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.02950286112895529,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.02950286112895529
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7853211009174312,
"acc_stderr": 0.017604304149256483,
"acc_norm": 0.7853211009174312,
"acc_norm_stderr": 0.017604304149256483
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7867177522349936,
"acc_stderr": 0.014648172749593515,
"acc_norm": 0.7867177522349936,
"acc_norm_stderr": 0.014648172749593515
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016117,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016117
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22793296089385476,
"acc_stderr": 0.014030149950805097,
"acc_norm": 0.22793296089385476,
"acc_norm_stderr": 0.014030149950805097
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281416,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281416
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.025773111169630464,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.025773111169630464
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41851368970013036,
"acc_stderr": 0.012599505608336467,
"acc_norm": 0.41851368970013036,
"acc_norm_stderr": 0.012599505608336467
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.02989616303312547,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.02989616303312547
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.019873802005061177,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.019873802005061177
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897306,
"mc2": 0.4883734627836678,
"mc2_stderr": 0.015369075462539867
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838234
},
"harness|gsm8k|5": {
"acc": 0.3100833965125095,
"acc_stderr": 0.012740305717376268
}
}
```
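The aggregated numbers above are also exposed as their own configuration, so they can be pulled without parsing this JSON. A minimal sketch (it assumes the "results" configuration and the "latest" split named in this repository's config; the snippet at the top of this card shows the same pattern for a task split):

```python
from datasets import load_dataset

# "results" is the extra configuration holding the aggregated metrics;
# "latest" is the split that points at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_YeungNLP__firefly-zephyr-6x7b-lora",
    "results",
    split="latest",
)
print(results[0])
```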
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_YeungNLP__firefly-zephyr-6x7b-lora | [
"region:us"
] | 2023-12-29T18:53:50+00:00 | {"pretty_name": "Evaluation run of YeungNLP/firefly-zephyr-6x7b-lora", "dataset_summary": "Dataset automatically created during the evaluation run of model [YeungNLP/firefly-zephyr-6x7b-lora](https://huggingface.co/YeungNLP/firefly-zephyr-6x7b-lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YeungNLP__firefly-zephyr-6x7b-lora\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T18:51:32.480572](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-zephyr-6x7b-lora/blob/main/results_2023-12-29T18-51-32.480572.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5989926693451659,\n \"acc_stderr\": 0.03334318950172643,\n \"acc_norm\": 0.6049039699813348,\n \"acc_norm_stderr\": 0.034036223081089764,\n \"mc1\": 0.34761321909424725,\n \"mc1_stderr\": 0.016670769188897306,\n \"mc2\": 0.4883734627836678,\n \"mc2_stderr\": 0.015369075462539867\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5776450511945392,\n \"acc_stderr\": 0.014434138713379977,\n \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892896\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6292571200955985,\n \"acc_stderr\": 0.004820166002253079,\n \"acc_norm\": 0.8280223063134834,\n \"acc_norm_stderr\": 0.0037658983649388727\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137595,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137595\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n \"acc_stderr\": 0.025906087021319295,\n \"acc_norm\": 0.7064516129032258,\n \"acc_norm_stderr\": 0.025906087021319295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124495,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124495\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5820512820512821,\n \"acc_stderr\": 0.025007329882461217,\n \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.025007329882461217\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37407407407407406,\n \"acc_stderr\": 0.02950286112895529,\n \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.02950286112895529\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7853211009174312,\n \"acc_stderr\": 0.017604304149256483,\n \"acc_norm\": 0.7853211009174312,\n \"acc_norm_stderr\": 0.017604304149256483\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7867177522349936,\n \"acc_stderr\": 
0.014648172749593515,\n \"acc_norm\": 0.7867177522349936,\n \"acc_norm_stderr\": 0.014648172749593515\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016117,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016117\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22793296089385476,\n \"acc_stderr\": 0.014030149950805097,\n \"acc_norm\": 0.22793296089385476,\n \"acc_norm_stderr\": 0.014030149950805097\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281416,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281416\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630464,\n \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630464\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41851368970013036,\n \"acc_stderr\": 0.012599505608336467,\n \"acc_norm\": 0.41851368970013036,\n \"acc_norm_stderr\": 0.012599505608336467\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.02989616303312547,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.02989616303312547\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5931372549019608,\n \"acc_stderr\": 0.019873802005061177,\n \"acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.019873802005061177\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34761321909424725,\n \"mc1_stderr\": 0.016670769188897306,\n \"mc2\": 0.4883734627836678,\n \"mc2_stderr\": 0.015369075462539867\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838234\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3100833965125095,\n \"acc_stderr\": 0.012740305717376268\n }\n}\n```", "repo_url": 
"https://huggingface.co/YeungNLP/firefly-zephyr-6x7b-lora", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|arc:challenge|25_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|gsm8k|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hellaswag|10_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T18-51-32.480572.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T18-51-32.480572.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T18-51-32.480572.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T18-51-32.480572.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T18-51-32.480572.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T18-51-32.480572.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["**/details_harness|winogrande|5_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T18-51-32.480572.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T18_51_32.480572", "path": ["results_2023-12-29T18-51-32.480572.parquet"]}, {"split": "latest", "path": 
["results_2023-12-29T18-51-32.480572.parquet"]}]}]} | 2023-12-29T18:54:11+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of YeungNLP/firefly-zephyr-6x7b-lora
Dataset automatically created during the evaluation run of model YeungNLP/firefly-zephyr-6x7b-lora on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
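For example (a minimal sketch; the repository ID below is inferred from the `details_<org>__<model>` naming pattern used by the other leaderboard detail datasets, so treat it as an assumption):

```python
from datasets import load_dataset

# Repository ID assumed to follow the "details_<org>__<model>" convention.
data = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-zephyr-6x7b-lora",
	"harness_winogrande_5",
	split="train")
```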
## Latest results
These are the latest results from run 2023-12-29T18:51:32.480572 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of YeungNLP/firefly-zephyr-6x7b-lora\n\n\n\nDataset automatically created during the evaluation run of model YeungNLP/firefly-zephyr-6x7b-lora on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T18:51:32.480572(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of YeungNLP/firefly-zephyr-6x7b-lora\n\n\n\nDataset automatically created during the evaluation run of model YeungNLP/firefly-zephyr-6x7b-lora on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T18:51:32.480572(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of YeungNLP/firefly-zephyr-6x7b-lora\n\n\n\nDataset automatically created during the evaluation run of model YeungNLP/firefly-zephyr-6x7b-lora on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T18:51:32.480572(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
096e5bbabc29a138cb583b24877f57912885a8ae |
# Dataset Card for Evaluation run of abhinand/mistral7b-test001
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhinand/mistral7b-test001](https://huggingface.co/abhinand/mistral7b-test001) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhinand__mistral7b-test001",
"harness_winogrande_5",
split="train")
```
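The aggregated metrics live in the "results" configuration mentioned above. A minimal sketch of loading them, assuming the same split naming ("latest") as the per-task configurations:

```python
from datasets import load_dataset

# Aggregated run-level metrics; the "latest" split points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_abhinand__mistral7b-test001",
	"results",
	split="latest")
```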
## Latest results
These are the [latest results from run 2023-12-29T19:04:00.297390](https://huggingface.co/datasets/open-llm-leaderboard/details_abhinand__mistral7b-test001/blob/main/results_2023-12-29T19-04-00.297390.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23246455070135374,
"acc_stderr": 0.02993171081257865,
"acc_norm": 0.232050358997208,
"acc_norm_stderr": 0.03071885273246434,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.5007337530776726,
"mc2_stderr": 0.015879917676609904
},
"harness|arc:challenge|25": {
"acc": 0.21416382252559726,
"acc_stderr": 0.011988383205966515,
"acc_norm": 0.24658703071672355,
"acc_norm_stderr": 0.012595726268790125
},
"harness|hellaswag|10": {
"acc": 0.26628161720772753,
"acc_stderr": 0.004411099046251008,
"acc_norm": 0.26777534355706034,
"acc_norm_stderr": 0.004418948941099411
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.5007337530776726,
"mc2_stderr": 0.015879917676609904
},
"harness|winogrande|5": {
"acc": 0.5232833464877664,
"acc_stderr": 0.014037241309573638
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_abhinand__mistral7b-test001 | [
"region:us"
] | 2023-12-29T19:06:23+00:00 | {"pretty_name": "Evaluation run of abhinand/mistral7b-test001", "dataset_summary": "Dataset automatically created during the evaluation run of model [abhinand/mistral7b-test001](https://huggingface.co/abhinand/mistral7b-test001) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhinand__mistral7b-test001\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T19:04:00.297390](https://huggingface.co/datasets/open-llm-leaderboard/details_abhinand__mistral7b-test001/blob/main/results_2023-12-29T19-04-00.297390.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23246455070135374,\n \"acc_stderr\": 0.02993171081257865,\n \"acc_norm\": 0.232050358997208,\n \"acc_norm_stderr\": 0.03071885273246434,\n \"mc1\": 0.29008567931456547,\n \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.5007337530776726,\n \"mc2_stderr\": 0.015879917676609904\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.21416382252559726,\n \"acc_stderr\": 0.011988383205966515,\n \"acc_norm\": 0.24658703071672355,\n \"acc_norm_stderr\": 0.012595726268790125\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26628161720772753,\n \"acc_stderr\": 0.004411099046251008,\n \"acc_norm\": 0.26777534355706034,\n \"acc_norm_stderr\": 0.004418948941099411\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 
0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 
0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.5007337530776726,\n \"mc2_stderr\": 0.015879917676609904\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5232833464877664,\n \"acc_stderr\": 0.014037241309573638\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/abhinand/mistral7b-test001", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|arc:challenge|25_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|gsm8k|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hellaswag|10_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-04-00.297390.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-04-00.297390.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-04-00.297390.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T19-04-00.297390.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-04-00.297390.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-04-00.297390.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["**/details_harness|winogrande|5_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T19-04-00.297390.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T19_04_00.297390", "path": ["results_2023-12-29T19-04-00.297390.parquet"]}, {"split": "latest", "path": 
["results_2023-12-29T19-04-00.297390.parquet"]}]}]} | 2023-12-29T19:06:46+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abhinand/mistral7b-test001
Dataset automatically created during the evaluation run of model abhinand/mistral7b-test001 on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-29T19:04:00.297390(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of abhinand/mistral7b-test001\n\n\n\nDataset automatically created during the evaluation run of model abhinand/mistral7b-test001 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T19:04:00.297390(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abhinand/mistral7b-test001\n\n\n\nDataset automatically created during the evaluation run of model abhinand/mistral7b-test001 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T19:04:00.297390(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
183,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of abhinand/mistral7b-test001\n\n\n\nDataset automatically created during the evaluation run of model abhinand/mistral7b-test001 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T19:04:00.297390(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
e5be4bed807b5d5cb73d88f0c831000e586c157d |
# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.6-mistral-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cognitivecomputations/dolphin-2.6-mistral-7b](https://huggingface.co/cognitivecomputations/dolphin-2.6-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-05T00:53:12.910957](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b/blob/main/results_2024-01-05T00-53-12.910957.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6311678740586428,
"acc_stderr": 0.03235623922383324,
"acc_norm": 0.6353556161940662,
"acc_norm_stderr": 0.03299949537775763,
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.55647761603073,
"mc2_stderr": 0.015289986307918129
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.014312094557946707,
"acc_norm": 0.628839590443686,
"acc_norm_stderr": 0.014117971901142822
},
"harness|hellaswag|10": {
"acc": 0.6471818362875921,
"acc_stderr": 0.004768701562988879,
"acc_norm": 0.8405696076478789,
"acc_norm_stderr": 0.003653288043555801
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.0286372356398009,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.0286372356398009
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0436031486007746,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0436031486007746
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094764,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094764
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431378,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431378
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.01385372417092253,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.01385372417092253
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247333,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247333
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3486033519553073,
"acc_stderr": 0.015937484656687033,
"acc_norm": 0.3486033519553073,
"acc_norm_stderr": 0.015937484656687033
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.012685906538206244,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.012685906538206244
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254187,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254187
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495155,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495155
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252089,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252089
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.55647761603073,
"mc2_stderr": 0.015289986307918129
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774104
},
"harness|gsm8k|5": {
"acc": 0.4609552691432904,
"acc_stderr": 0.013730428449116327
}
}
```
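The aggregated scores above are also stored in the `results` configuration of this dataset. The following is a minimal sketch for retrieving them programmatically; it assumes the `results` configuration exposes a `latest` split alias like the per-task configurations do (otherwise, one of the timestamped splits listed in the configuration metadata at the end of this card can be used instead).

```python
from datasets import load_dataset

# Sketch: pull the aggregated scores from the "results" configuration.
# The "latest" split alias is assumed to exist, mirroring the per-task configs;
# fall back to a timestamped split such as "2024_01_05T00_53_12.910957" if it does not.
results = load_dataset(
    "open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b",
    "results",
    split="latest",
)

# The split holds the aggregated scores for the corresponding evaluation run.
print(results[0])
```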
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
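While a full field-level description is still to be added, the configuration metadata at the end of this card already documents the layout: one configuration per evaluated task (plus an aggregated `results` configuration), and within each configuration one split per run timestamp together with a `latest` alias pointing at the most recent run. Below is a minimal sketch of navigating that structure; the configuration name `harness_gsm8k_5` and the split names are taken from the metadata, and the rest uses the standard `datasets` API.

```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(REPO)
print(f"{len(configs)} configurations, e.g. {configs[:3]}")

# Every configuration exposes one split per run timestamp and a "latest" alias.
gsm8k_details = load_dataset(REPO, "harness_gsm8k_5", split="latest")
print(gsm8k_details)               # per-example details for the most recent GSM8K run
print(gsm8k_details.column_names)  # inspect the available fields
```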
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b | [
"region:us"
] | 2023-12-29T19:20:58+00:00 | {"pretty_name": "Evaluation run of cognitivecomputations/dolphin-2.6-mistral-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [cognitivecomputations/dolphin-2.6-mistral-7b](https://huggingface.co/cognitivecomputations/dolphin-2.6-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T00:53:12.910957](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b/blob/main/results_2024-01-05T00-53-12.910957.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6311678740586428,\n \"acc_stderr\": 0.03235623922383324,\n \"acc_norm\": 0.6353556161940662,\n \"acc_norm_stderr\": 0.03299949537775763,\n \"mc1\": 0.37209302325581395,\n \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.55647761603073,\n \"mc2_stderr\": 0.015289986307918129\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.014312094557946707,\n \"acc_norm\": 0.628839590443686,\n \"acc_norm_stderr\": 0.014117971901142822\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6471818362875921,\n \"acc_stderr\": 0.004768701562988879,\n \"acc_norm\": 0.8405696076478789,\n \"acc_norm_stderr\": 0.003653288043555801\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.0286372356398009,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.0286372356398009\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.0436031486007746,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.0436031486007746\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094764,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094764\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431378,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431378\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n \"acc_stderr\": 0.01385372417092253,\n 
\"acc_norm\": 0.8160919540229885,\n \"acc_norm_stderr\": 0.01385372417092253\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3486033519553073,\n \"acc_stderr\": 0.015937484656687033,\n \"acc_norm\": 0.3486033519553073,\n \"acc_norm_stderr\": 0.015937484656687033\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n \"acc_stderr\": 0.012685906538206244,\n \"acc_norm\": 0.4426336375488918,\n \"acc_norm_stderr\": 0.012685906538206244\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254187,\n \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254187\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495155,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495155\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37209302325581395,\n \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.55647761603073,\n \"mc2_stderr\": 0.015289986307918129\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774104\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4609552691432904,\n \"acc_stderr\": 0.013730428449116327\n }\n}\n```", "repo_url": 
"https://huggingface.co/cognitivecomputations/dolphin-2.6-mistral-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|arc:challenge|25_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|arc:challenge|25_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|gsm8k|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|gsm8k|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hellaswag|10_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hellaswag|10_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-18-32.219011.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T19-18-32.219011.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-53-12.910957.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-53-12.910957.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-53-12.910957.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T00-53-12.910957.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-18-32.219011.parquet"]}, 
{"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["**/details_harness|winogrande|5_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": ["**/details_harness|winogrande|5_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T00-53-12.910957.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T19_18_32.219011", "path": ["results_2023-12-29T19-18-32.219011.parquet"]}, {"split": "2024_01_05T00_53_12.910957", "path": 
["results_2024-01-05T00-53-12.910957.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T00-53-12.910957.parquet"]}]}]} | 2024-01-05T00:56:20+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.6-mistral-7b
Dataset automatically created during the evaluation run of model cognitivecomputations/dolphin-2.6-mistral-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-05T00:53:12.910957 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.6-mistral-7b\n\n\n\nDataset automatically created during the evaluation run of model cognitivecomputations/dolphin-2.6-mistral-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T00:53:12.910957(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.6-mistral-7b\n\n\n\nDataset automatically created during the evaluation run of model cognitivecomputations/dolphin-2.6-mistral-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T00:53:12.910957(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.6-mistral-7b\n\n\n\nDataset automatically created during the evaluation run of model cognitivecomputations/dolphin-2.6-mistral-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T00:53:12.910957(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
aca3f17cb7c9887034066b4315f93c1954f4ff56 |
# Dataset Card for Evaluation run of maywell/Synatra-10.7B-v0.4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [maywell/Synatra-10.7B-v0.4](https://huggingface.co/maywell/Synatra-10.7B-v0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maywell__Synatra-10.7B-v0.4",
"harness_winogrande_5",
split="train")
```
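
The aggregated "results" configuration mentioned above can be loaded the same way. The snippet below is a minimal sketch, assuming the "results" configuration and its "latest" split listed in this card's metadata:

```python
from datasets import load_dataset

# Aggregated metrics for this model. The "latest" split always points to the most
# recent run; older runs keep their timestamped split names
# (here, "2023_12_29T19_20_40.576822").
results = load_dataset("open-llm-leaderboard/details_maywell__Synatra-10.7B-v0.4",
	"results",
	split="latest")
```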
## Latest results
These are the [latest results from run 2023-12-29T19:20:40.576822](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-10.7B-v0.4/blob/main/results_2023-12-29T19-20-40.576822.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6258404665626669,
"acc_stderr": 0.0325401029239991,
"acc_norm": 0.6288101811459054,
"acc_norm_stderr": 0.03320287642998332,
"mc1": 0.3488372093023256,
"mc1_stderr": 0.016684419859986893,
"mc2": 0.511139865309024,
"mc2_stderr": 0.014967148493344726
},
"harness|arc:challenge|25": {
"acc": 0.60580204778157,
"acc_stderr": 0.014280522667467323,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726097
},
"harness|hellaswag|10": {
"acc": 0.6258713403704441,
"acc_stderr": 0.004829081532826499,
"acc_norm": 0.8247361083449513,
"acc_norm_stderr": 0.003794156551272269
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.03784271932887468,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.03784271932887468
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.034867317274198714,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.034867317274198714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.025545650426603617,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.025545650426603617
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5820512820512821,
"acc_stderr": 0.02500732988246122,
"acc_norm": 0.5820512820512821,
"acc_norm_stderr": 0.02500732988246122
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881564,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881564
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.03149930577784906,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.03149930577784906
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.01684767640009109,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.01684767640009109
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503217,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503217
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092365,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7994891443167306,
"acc_stderr": 0.0143176537085942,
"acc_norm": 0.7994891443167306,
"acc_norm_stderr": 0.0143176537085942
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3094972067039106,
"acc_stderr": 0.015461169002371553,
"acc_norm": 0.3094972067039106,
"acc_norm_stderr": 0.015461169002371553
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4517601043024772,
"acc_stderr": 0.012710662233660247,
"acc_norm": 0.4517601043024772,
"acc_norm_stderr": 0.012710662233660247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.01941253924203216,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.01941253924203216
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686399,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686399
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3488372093023256,
"mc1_stderr": 0.016684419859986893,
"mc2": 0.511139865309024,
"mc2_stderr": 0.014967148493344726
},
"harness|winogrande|5": {
"acc": 0.8184688239936859,
"acc_stderr": 0.010833276515007484
},
"harness|gsm8k|5": {
"acc": 0.5003790750568613,
"acc_stderr": 0.013772480761626172
}
}
```
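
For a quick side-by-side look at a few headline tasks, the per-task entries above can be flattened into a small table. This is only an illustrative sketch using values copied from the JSON above (the task selection and the pandas dependency are assumptions, not part of the generated card):

```python
import pandas as pd

# A handful of the per-task metrics shown above, copied verbatim from the JSON.
results = {
    "harness|arc:challenge|25": {"acc": 0.60580204778157, "acc_norm": 0.6493174061433447},
    "harness|hellaswag|10": {"acc": 0.6258713403704441, "acc_norm": 0.8247361083449513},
    "harness|winogrande|5": {"acc": 0.8184688239936859},
    "harness|gsm8k|5": {"acc": 0.5003790750568613},
}

rows = []
for task, metrics in results.items():
    # Task names follow the pattern "harness|<task>|<n_fewshot>".
    _, name, n_shot = task.split("|")
    rows.append({"task": name, "n_shot": int(n_shot), **metrics})

df = pd.DataFrame(rows).set_index("task")
print(df)
```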
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_maywell__Synatra-10.7B-v0.4 | [
"region:us"
] | 2023-12-29T19:22:54+00:00 | {"pretty_name": "Evaluation run of maywell/Synatra-10.7B-v0.4", "dataset_summary": "Dataset automatically created during the evaluation run of model [maywell/Synatra-10.7B-v0.4](https://huggingface.co/maywell/Synatra-10.7B-v0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__Synatra-10.7B-v0.4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T19:20:40.576822](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-10.7B-v0.4/blob/main/results_2023-12-29T19-20-40.576822.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6258404665626669,\n \"acc_stderr\": 0.0325401029239991,\n \"acc_norm\": 0.6288101811459054,\n \"acc_norm_stderr\": 0.03320287642998332,\n \"mc1\": 0.3488372093023256,\n \"mc1_stderr\": 0.016684419859986893,\n \"mc2\": 0.511139865309024,\n \"mc2_stderr\": 0.014967148493344726\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.60580204778157,\n \"acc_stderr\": 0.014280522667467323,\n \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726097\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6258713403704441,\n \"acc_stderr\": 0.004829081532826499,\n \"acc_norm\": 0.8247361083449513,\n \"acc_norm_stderr\": 0.003794156551272269\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n 
\"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n \"acc_stderr\": 0.03784271932887468,\n \"acc_norm\": 0.5606936416184971,\n \"acc_norm_stderr\": 0.03784271932887468\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.034867317274198714,\n \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.034867317274198714\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603617,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603617\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5820512820512821,\n \"acc_stderr\": 0.02500732988246122,\n \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.02500732988246122\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881564,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881564\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.03149930577784906,\n \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.03149930577784906\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8091743119266055,\n \"acc_stderr\": 0.01684767640009109,\n \"acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.01684767640009109\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503217,\n \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503217\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.04039314978724561,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.04039314978724561\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092365,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092365\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7994891443167306,\n \"acc_stderr\": 0.0143176537085942,\n \"acc_norm\": 
0.7994891443167306,\n \"acc_norm_stderr\": 0.0143176537085942\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n \"acc_stderr\": 0.015461169002371553,\n \"acc_norm\": 0.3094972067039106,\n \"acc_norm_stderr\": 0.015461169002371553\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4517601043024772,\n \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.4517601043024772,\n \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.01941253924203216,\n \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.01941253924203216\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686399,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686399\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3488372093023256,\n \"mc1_stderr\": 0.016684419859986893,\n \"mc2\": 0.511139865309024,\n \"mc2_stderr\": 0.014967148493344726\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8184688239936859,\n \"acc_stderr\": 0.010833276515007484\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5003790750568613,\n \"acc_stderr\": 0.013772480761626172\n }\n}\n```", "repo_url": "https://huggingface.co/maywell/Synatra-10.7B-v0.4", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|arc:challenge|25_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|gsm8k|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hellaswag|10_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-20-40.576822.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-20-40.576822.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-20-40.576822.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T19-20-40.576822.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-20-40.576822.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-20-40.576822.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["**/details_harness|winogrande|5_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T19-20-40.576822.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T19_20_40.576822", "path": ["results_2023-12-29T19-20-40.576822.parquet"]}, {"split": "latest", "path": 
["results_2023-12-29T19-20-40.576822.parquet"]}]}]} | 2023-12-29T19:23:20+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of maywell/Synatra-10.7B-v0.4
Dataset automatically created during the evaluation run of model maywell/Synatra-10.7B-v0.4 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
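A minimal sketch, assuming the details repository for this model follows the leaderboard's usual naming convention (`open-llm-leaderboard/details_maywell__Synatra-10.7B-v0.4`) and that the `datasets` library is installed:

```python
from datasets import load_dataset

# Load the 5-shot Winogrande details from the latest run of this model.
data = load_dataset(
    "open-llm-leaderboard/details_maywell__Synatra-10.7B-v0.4",  # assumed repo name
    "harness_winogrande_5",
    split="train",
)
```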
## Latest results
These are the latest results from run 2023-12-29T19:20:40.576822 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of maywell/Synatra-10.7B-v0.4\n\n\n\nDataset automatically created during the evaluation run of model maywell/Synatra-10.7B-v0.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T19:20:40.576822(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of maywell/Synatra-10.7B-v0.4\n\n\n\nDataset automatically created during the evaluation run of model maywell/Synatra-10.7B-v0.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T19:20:40.576822(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of maywell/Synatra-10.7B-v0.4\n\n\n\nDataset automatically created during the evaluation run of model maywell/Synatra-10.7B-v0.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T19:20:40.576822(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
6f1450e4a3ce6e5a0e9bd28ef9f4f66fdb3364e8 |
# Dataset Card for Evaluation run of Felladrin/Smol-Llama-101M-Chat-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Felladrin/Smol-Llama-101M-Chat-v1](https://huggingface.co/Felladrin/Smol-Llama-101M-Chat-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Felladrin__Smol-Llama-101M-Chat-v1",
"harness_winogrande_5",
split="train")
```
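To inspect the aggregated metrics instead of per-task details, the same call can target the "results" configuration; a sketch, assuming the split names follow the convention described above (the run timestamp, plus "latest" for the most recent run):

```python
from datasets import load_dataset

# Aggregated metrics; the "latest" split always points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_Felladrin__Smol-Llama-101M-Chat-v1",
    "results",
    split="latest",
)
print(results[0])
```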
## Latest results
These are the [latest results from run 2023-12-29T19:30:13.746850](https://huggingface.co/datasets/open-llm-leaderboard/details_Felladrin__Smol-Llama-101M-Chat-v1/blob/main/results_2023-12-29T19-30-13.746850.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24875102412567496,
"acc_stderr": 0.030480837022125038,
"acc_norm": 0.24959862358168325,
"acc_norm_stderr": 0.03127898135128497,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015023,
"mc2": 0.45756971268130436,
"mc2_stderr": 0.015178620872901784
},
"harness|arc:challenge|25": {
"acc": 0.18686006825938567,
"acc_stderr": 0.011391015649694379,
"acc_norm": 0.22866894197952217,
"acc_norm_stderr": 0.012272853582540795
},
"harness|hellaswag|10": {
"acc": 0.27504481179047996,
"acc_stderr": 0.00445624260195063,
"acc_norm": 0.28689504082852024,
"acc_norm_stderr": 0.0045138774650621254
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.03972552884785137,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.03972552884785137
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23026315789473684,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.23026315789473684,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.025447863825108614,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.025447863825108614
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080343,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080343
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.02185150982203172,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.02185150982203172
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276864,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276864
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.33225806451612905,
"acc_stderr": 0.02679556084812279,
"acc_norm": 0.33225806451612905,
"acc_norm_stderr": 0.02679556084812279
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23232323232323232,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.23232323232323232,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27979274611398963,
"acc_stderr": 0.032396370467357015,
"acc_norm": 0.27979274611398963,
"acc_norm_stderr": 0.032396370467357015
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.258974358974359,
"acc_stderr": 0.022211106810061672,
"acc_norm": 0.258974358974359,
"acc_norm_stderr": 0.022211106810061672
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073835,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073835
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31512605042016806,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.31512605042016806,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22385321100917432,
"acc_stderr": 0.017871217767790222,
"acc_norm": 0.22385321100917432,
"acc_norm_stderr": 0.017871217767790222
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.02830465794303529,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.02830465794303529
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21524663677130046,
"acc_stderr": 0.02758406660220826,
"acc_norm": 0.21524663677130046,
"acc_norm_stderr": 0.02758406660220826
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.19083969465648856,
"acc_stderr": 0.034465133507525975,
"acc_norm": 0.19083969465648856,
"acc_norm_stderr": 0.034465133507525975
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.026453508054040325,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.026453508054040325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.01576998484069052,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.01576998484069052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.022894082489925992,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.022894082489925992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.02417084087934102,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.02417084087934102
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.28938906752411575,
"acc_stderr": 0.025755865922632924,
"acc_norm": 0.28938906752411575,
"acc_norm_stderr": 0.025755865922632924
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880585,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880585
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2470664928292047,
"acc_stderr": 0.011015752255279338,
"acc_norm": 0.2470664928292047,
"acc_norm_stderr": 0.011015752255279338
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.28308823529411764,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.28308823529411764,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528047,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528047
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.17272727272727273,
"acc_stderr": 0.036206918339292196,
"acc_norm": 0.17272727272727273,
"acc_norm_stderr": 0.036206918339292196
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.32653061224489793,
"acc_stderr": 0.030021056238440317,
"acc_norm": 0.32653061224489793,
"acc_norm_stderr": 0.030021056238440317
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.1890547263681592,
"acc_stderr": 0.027686913588013024,
"acc_norm": 0.1890547263681592,
"acc_norm_stderr": 0.027686913588013024
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.0317555478662992,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.0317555478662992
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21637426900584794,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.21637426900584794,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015023,
"mc2": 0.45756971268130436,
"mc2_stderr": 0.015178620872901784
},
"harness|winogrande|5": {
"acc": 0.500394632991318,
"acc_stderr": 0.014052481306049512
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225172
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Felladrin__Smol-Llama-101M-Chat-v1 | [
"region:us"
] | 2023-12-29T19:31:40+00:00 | {"pretty_name": "Evaluation run of Felladrin/Smol-Llama-101M-Chat-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Felladrin/Smol-Llama-101M-Chat-v1](https://huggingface.co/Felladrin/Smol-Llama-101M-Chat-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Felladrin__Smol-Llama-101M-Chat-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T19:30:13.746850](https://huggingface.co/datasets/open-llm-leaderboard/details_Felladrin__Smol-Llama-101M-Chat-v1/blob/main/results_2023-12-29T19-30-13.746850.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24875102412567496,\n \"acc_stderr\": 0.030480837022125038,\n \"acc_norm\": 0.24959862358168325,\n \"acc_norm_stderr\": 0.03127898135128497,\n \"mc1\": 0.26193390452876375,\n \"mc1_stderr\": 0.015392118805015023,\n \"mc2\": 0.45756971268130436,\n \"mc2_stderr\": 0.015178620872901784\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.18686006825938567,\n \"acc_stderr\": 0.011391015649694379,\n \"acc_norm\": 0.22866894197952217,\n \"acc_norm_stderr\": 0.012272853582540795\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27504481179047996,\n \"acc_stderr\": 0.00445624260195063,\n \"acc_norm\": 0.28689504082852024,\n \"acc_norm_stderr\": 0.0045138774650621254\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.03972552884785137,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.03972552884785137\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.03426059424403165,\n \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.03426059424403165\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.025447863825108614,\n \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.025447863825108614\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n \"acc_stderr\": 0.03586879280080343,\n \"acc_norm\": 0.24305555555555555,\n \"acc_norm_stderr\": 0.03586879280080343\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.23544973544973544,\n \"acc_stderr\": 0.02185150982203172,\n \"acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.02185150982203172\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.33225806451612905,\n \"acc_stderr\": 0.02679556084812279,\n \"acc_norm\": 0.33225806451612905,\n \"acc_norm_stderr\": 0.02679556084812279\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n \"acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.23232323232323232,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.23232323232323232,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.27979274611398963,\n \"acc_stderr\": 0.032396370467357015,\n \"acc_norm\": 0.27979274611398963,\n \"acc_norm_stderr\": 
0.032396370467357015\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.258974358974359,\n \"acc_stderr\": 0.022211106810061672,\n \"acc_norm\": 0.258974358974359,\n \"acc_norm_stderr\": 0.022211106810061672\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073835,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073835\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.31512605042016806,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.31512605042016806,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22385321100917432,\n \"acc_stderr\": 0.017871217767790222,\n \"acc_norm\": 0.22385321100917432,\n \"acc_norm_stderr\": 0.017871217767790222\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.02830465794303529,\n \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.02830465794303529\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21524663677130046,\n \"acc_stderr\": 0.02758406660220826,\n \"acc_norm\": 0.21524663677130046,\n \"acc_norm_stderr\": 0.02758406660220826\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.19083969465648856,\n \"acc_stderr\": 0.034465133507525975,\n \"acc_norm\": 0.19083969465648856,\n \"acc_norm_stderr\": 0.034465133507525975\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20512820512820512,\n \"acc_stderr\": 0.026453508054040325,\n \"acc_norm\": 0.20512820512820512,\n \"acc_norm_stderr\": 0.026453508054040325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.26436781609195403,\n \"acc_stderr\": 0.01576998484069052,\n \"acc_norm\": 0.26436781609195403,\n \"acc_norm_stderr\": 0.01576998484069052\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.022894082489925992,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.022894082489925992\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.02417084087934102,\n \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.02417084087934102\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.28938906752411575,\n \"acc_stderr\": 0.025755865922632924,\n \"acc_norm\": 0.28938906752411575,\n \"acc_norm_stderr\": 0.025755865922632924\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.02465968518596728,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.02465968518596728\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880585,\n \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880585\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n \"acc_stderr\": 0.011015752255279338,\n \"acc_norm\": 0.2470664928292047,\n \"acc_norm_stderr\": 0.011015752255279338\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.28308823529411764,\n \"acc_stderr\": 0.02736586113151381,\n \"acc_norm\": 0.28308823529411764,\n \"acc_norm_stderr\": 0.02736586113151381\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528047,\n \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528047\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.17272727272727273,\n \"acc_stderr\": 0.036206918339292196,\n \"acc_norm\": 0.17272727272727273,\n \"acc_norm_stderr\": 0.036206918339292196\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.32653061224489793,\n \"acc_stderr\": 0.030021056238440317,\n \"acc_norm\": 0.32653061224489793,\n \"acc_norm_stderr\": 0.030021056238440317\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.1890547263681592,\n \"acc_stderr\": 0.027686913588013024,\n \"acc_norm\": 0.1890547263681592,\n \"acc_norm_stderr\": 0.027686913588013024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n \"acc_stderr\": 0.0317555478662992,\n \"acc_norm\": 0.21084337349397592,\n \"acc_norm_stderr\": 0.0317555478662992\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21637426900584794,\n \"acc_stderr\": 0.03158149539338734,\n \"acc_norm\": 0.21637426900584794,\n \"acc_norm_stderr\": 0.03158149539338734\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n \"mc1_stderr\": 0.015392118805015023,\n \"mc2\": 0.45756971268130436,\n \"mc2_stderr\": 0.015178620872901784\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.500394632991318,\n \"acc_stderr\": 0.014052481306049512\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.000758150113722517,\n \"acc_stderr\": 0.0007581501137225172\n }\n}\n```", "repo_url": "https://huggingface.co/Felladrin/Smol-Llama-101M-Chat-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|arc:challenge|25_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|gsm8k|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hellaswag|10_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-30-13.746850.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-30-13.746850.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-30-13.746850.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T19-30-13.746850.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-30-13.746850.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T19_30_13.746850", "path": ["**/details_harness|winogrande|5_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T19-30-13.746850.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_29T19_30_13.746850", "path": ["results_2023-12-29T19-30-13.746850.parquet"]}, {"split": "latest", "path": ["results_2023-12-29T19-30-13.746850.parquet"]}]}]} | 2023-12-29T19:32:10+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Felladrin/Smol-Llama-101M-Chat-v1
Dataset automatically created during the evaluation run of model Felladrin/Smol-Llama-101M-Chat-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-29T19:30:13.746850 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Felladrin/Smol-Llama-101M-Chat-v1\n\n\n\nDataset automatically created during the evaluation run of model Felladrin/Smol-Llama-101M-Chat-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T19:30:13.746850(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Felladrin/Smol-Llama-101M-Chat-v1\n\n\n\nDataset automatically created during the evaluation run of model Felladrin/Smol-Llama-101M-Chat-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T19:30:13.746850(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Felladrin/Smol-Llama-101M-Chat-v1\n\n\n\nDataset automatically created during the evaluation run of model Felladrin/Smol-Llama-101M-Chat-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T19:30:13.746850(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
40d9984a56b56696e0d532c1bac4e3991ff4aa98 |
# Dataset Card for Evaluation run of alykassem/ds_diasum_md_mixtral
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alykassem/ds_diasum_md_mixtral](https://huggingface.co/alykassem/ds_diasum_md_mixtral) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alykassem__ds_diasum_md_mixtral",
"harness_winogrande_5",
split="train")
```
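
If you want to see all 63 per-task configurations before choosing one, the `datasets` library also exposes `get_dataset_config_names`. The snippet below is a minimal sketch: the `harness_gsm8k_5` configuration name and the `latest` split alias are assumptions based on the naming pattern of these leaderboard detail datasets, so check the printed list before relying on them.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_alykassem__ds_diasum_md_mixtral"

# Enumerate every configuration exposed by this dataset
# (one per evaluated task, plus the aggregated "results" configuration).
configs = get_dataset_config_names(repo)
print(len(configs), "configurations, e.g.:", configs[:5])

# Load the per-sample details for one task; the config name below assumes
# the usual harness_<task>_<n_shot> pattern and the "latest" split alias.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k_details)
```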
## Latest results
These are the [latest results from run 2023-12-29T19:29:56.671932](https://huggingface.co/datasets/open-llm-leaderboard/details_alykassem__ds_diasum_md_mixtral/blob/main/results_2023-12-29T19-29-56.671932.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.692270226414827,
"acc_stderr": 0.030762586489589964,
"acc_norm": 0.6972458162944918,
"acc_norm_stderr": 0.031355995272453654,
"mc1": 0.4039167686658507,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.5572164097692784,
"mc2_stderr": 0.01463024293704983
},
"harness|arc:challenge|25": {
"acc": 0.6203071672354948,
"acc_stderr": 0.014182119866974872,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902272
},
"harness|hellaswag|10": {
"acc": 0.652459669388568,
"acc_stderr": 0.004752158936871874,
"acc_norm": 0.854511053574985,
"acc_norm_stderr": 0.003518725257365599
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7622641509433963,
"acc_stderr": 0.026199808807561918,
"acc_norm": 0.7622641509433963,
"acc_norm_stderr": 0.026199808807561918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6510638297872341,
"acc_stderr": 0.031158522131357783,
"acc_norm": 0.6510638297872341,
"acc_norm_stderr": 0.031158522131357783
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5701754385964912,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.5701754385964912,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03960933549451207,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03960933549451207
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382182,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382182
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.021576248184514587,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.021576248184514587
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5960591133004927,
"acc_stderr": 0.03452453903822033,
"acc_norm": 0.5960591133004927,
"acc_norm_stderr": 0.03452453903822033
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.031584153240477086,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.031584153240477086
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678185,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678185
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.02394672474156397,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.02394672474156397
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.029502861128955293,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.029502861128955293
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7983193277310925,
"acc_stderr": 0.02606431340630452,
"acc_norm": 0.7983193277310925,
"acc_norm_stderr": 0.02606431340630452
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849928,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849928
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700472,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700472
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025045,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.02133174182974679,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.02133174182974679
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7443946188340808,
"acc_stderr": 0.029275891003969923,
"acc_norm": 0.7443946188340808,
"acc_norm_stderr": 0.029275891003969923
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462469,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.02919980245562281,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.02919980245562281
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.030833491146281235,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.030833491146281235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5982142857142857,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.5982142857142857,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673137,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8812260536398467,
"acc_stderr": 0.011569134791715655,
"acc_norm": 0.8812260536398467,
"acc_norm_stderr": 0.011569134791715655
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.0230836585869842,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.0230836585869842
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43910614525139663,
"acc_stderr": 0.016598022120580425,
"acc_norm": 0.43910614525139663,
"acc_norm_stderr": 0.016598022120580425
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7875816993464052,
"acc_stderr": 0.023420375478296132,
"acc_norm": 0.7875816993464052,
"acc_norm_stderr": 0.023420375478296132
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7877813504823151,
"acc_stderr": 0.023222756797435105,
"acc_norm": 0.7877813504823151,
"acc_norm_stderr": 0.023222756797435105
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8117283950617284,
"acc_stderr": 0.02175186606081587,
"acc_norm": 0.8117283950617284,
"acc_norm_stderr": 0.02175186606081587
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.549645390070922,
"acc_stderr": 0.029680105565029043,
"acc_norm": 0.549645390070922,
"acc_norm_stderr": 0.029680105565029043
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5221642764015645,
"acc_stderr": 0.012757683047716177,
"acc_norm": 0.5221642764015645,
"acc_norm_stderr": 0.012757683047716177
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7830882352941176,
"acc_stderr": 0.02503584522771127,
"acc_norm": 0.7830882352941176,
"acc_norm_stderr": 0.02503584522771127
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.017555818091322263,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.017555818091322263
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.027212835884073142,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.027212835884073142
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4039167686658507,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.5572164097692784,
"mc2_stderr": 0.01463024293704983
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569567
},
"harness|gsm8k|5": {
"acc": 0.5322213798332069,
"acc_stderr": 0.013743857303073793
}
}
```
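
If you prefer to read these aggregated numbers programmatically instead of copying them from the JSON above, one option is to load the "results" configuration described at the top of this card. This is a small sketch under the assumption that the aggregated run file is exposed under the config name `results` with a `latest` split alias, as in other leaderboard detail datasets; verify the schema of the returned split before post-processing it.

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run
# (the same numbers shown in the JSON block above).
results = load_dataset(
    "open-llm-leaderboard/details_alykassem__ds_diasum_md_mixtral",
    "results",
    split="latest",  # assumed split alias; some cards document it as "train"
)

# Print the first row to inspect the available columns before relying on them.
print(results)
print(results[0])
```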
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_alykassem__ds_diasum_md_mixtral | [
"region:us"
] | 2023-12-29T19:32:14+00:00 | {"pretty_name": "Evaluation run of alykassem/ds_diasum_md_mixtral", "dataset_summary": "Dataset automatically created during the evaluation run of model [alykassem/ds_diasum_md_mixtral](https://huggingface.co/alykassem/ds_diasum_md_mixtral) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alykassem__ds_diasum_md_mixtral\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T19:29:56.671932](https://huggingface.co/datasets/open-llm-leaderboard/details_alykassem__ds_diasum_md_mixtral/blob/main/results_2023-12-29T19-29-56.671932.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.692270226414827,\n \"acc_stderr\": 0.030762586489589964,\n \"acc_norm\": 0.6972458162944918,\n \"acc_norm_stderr\": 0.031355995272453654,\n \"mc1\": 0.4039167686658507,\n \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.5572164097692784,\n \"mc2_stderr\": 0.01463024293704983\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6203071672354948,\n \"acc_stderr\": 0.014182119866974872,\n \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902272\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.652459669388568,\n \"acc_stderr\": 0.004752158936871874,\n \"acc_norm\": 0.854511053574985,\n \"acc_norm_stderr\": 0.003518725257365599\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.6814814814814815,\n \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7622641509433963,\n \"acc_stderr\": 0.026199808807561918,\n \"acc_norm\": 0.7622641509433963,\n \"acc_norm_stderr\": 0.026199808807561918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n 
\"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6510638297872341,\n \"acc_stderr\": 0.031158522131357783,\n \"acc_norm\": 0.6510638297872341,\n \"acc_norm_stderr\": 0.031158522131357783\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5701754385964912,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.5701754385964912,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03960933549451207,\n \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03960933549451207\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382182,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382182\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8258064516129032,\n \"acc_stderr\": 0.021576248184514587,\n \"acc_norm\": 0.8258064516129032,\n \"acc_norm_stderr\": 0.021576248184514587\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5960591133004927,\n \"acc_stderr\": 0.03452453903822033,\n \"acc_norm\": 0.5960591133004927,\n \"acc_norm_stderr\": 0.03452453903822033\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.031584153240477086,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.031584153240477086\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678185,\n \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678185\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 
0.02394672474156397,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.02394672474156397\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37407407407407406,\n \"acc_stderr\": 0.029502861128955293,\n \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.029502861128955293\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7983193277310925,\n \"acc_stderr\": 0.02606431340630452,\n \"acc_norm\": 0.7983193277310925,\n \"acc_norm_stderr\": 0.02606431340630452\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849928,\n \"acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849928\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700472,\n \"acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700472\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025045,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8776371308016878,\n \"acc_stderr\": 0.02133174182974679,\n \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.02133174182974679\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7443946188340808,\n \"acc_stderr\": 0.029275891003969923,\n \"acc_norm\": 0.7443946188340808,\n \"acc_norm_stderr\": 0.029275891003969923\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462469,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.02919980245562281,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.02919980245562281\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.030833491146281235,\n \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.030833491146281235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5982142857142857,\n \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.5982142857142857,\n \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n \"acc_norm_stderr\": 0.016046261631673137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8812260536398467,\n \"acc_stderr\": 0.011569134791715655,\n \"acc_norm\": 0.8812260536398467,\n 
\"acc_norm_stderr\": 0.011569134791715655\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.0230836585869842,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.0230836585869842\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n \"acc_stderr\": 0.016598022120580425,\n \"acc_norm\": 0.43910614525139663,\n \"acc_norm_stderr\": 0.016598022120580425\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7875816993464052,\n \"acc_stderr\": 0.023420375478296132,\n \"acc_norm\": 0.7875816993464052,\n \"acc_norm_stderr\": 0.023420375478296132\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7877813504823151,\n \"acc_stderr\": 0.023222756797435105,\n \"acc_norm\": 0.7877813504823151,\n \"acc_norm_stderr\": 0.023222756797435105\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8117283950617284,\n \"acc_stderr\": 0.02175186606081587,\n \"acc_norm\": 0.8117283950617284,\n \"acc_norm_stderr\": 0.02175186606081587\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.549645390070922,\n \"acc_stderr\": 0.029680105565029043,\n \"acc_norm\": 0.549645390070922,\n \"acc_norm_stderr\": 0.029680105565029043\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5221642764015645,\n \"acc_stderr\": 0.012757683047716177,\n \"acc_norm\": 0.5221642764015645,\n \"acc_norm_stderr\": 0.012757683047716177\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7830882352941176,\n \"acc_stderr\": 0.02503584522771127,\n \"acc_norm\": 0.7830882352941176,\n \"acc_norm_stderr\": 0.02503584522771127\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.017555818091322263,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.017555818091322263\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.027212835884073142,\n \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.027212835884073142\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4039167686658507,\n \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.5572164097692784,\n \"mc2_stderr\": 0.01463024293704983\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569567\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5322213798332069,\n \"acc_stderr\": 0.013743857303073793\n }\n}\n```", "repo_url": "https://huggingface.co/alykassem/ds_diasum_md_mixtral", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|arc:challenge|25_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|gsm8k|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hellaswag|10_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-29-56.671932.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-29-56.671932.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-29-56.671932.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T19-29-56.671932.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-29-56.671932.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-29-56.671932.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["**/details_harness|winogrande|5_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T19-29-56.671932.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T19_29_56.671932", "path": ["results_2023-12-29T19-29-56.671932.parquet"]}, {"split": "latest", "path": 
["results_2023-12-29T19-29-56.671932.parquet"]}]}]} | 2023-12-29T19:32:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of alykassem/ds_diasum_md_mixtral
Dataset automatically created during the evaluation run of model alykassem/ds_diasum_md_mixtral on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-29T19:29:56.671932 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of alykassem/ds_diasum_md_mixtral\n\n\n\nDataset automatically created during the evaluation run of model alykassem/ds_diasum_md_mixtral on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T19:29:56.671932(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of alykassem/ds_diasum_md_mixtral\n\n\n\nDataset automatically created during the evaluation run of model alykassem/ds_diasum_md_mixtral on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T19:29:56.671932(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of alykassem/ds_diasum_md_mixtral\n\n\n\nDataset automatically created during the evaluation run of model alykassem/ds_diasum_md_mixtral on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T19:29:56.671932(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
0e7c099f3d8620d0d5f70469aec91538ed6f0715 |
# Dataset Card for Evaluation run of dillfrescott/trinity-medium
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [dillfrescott/trinity-medium](https://huggingface.co/dillfrescott/trinity-medium) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dillfrescott__trinity-medium",
"harness_winogrande_5",
split="train")
```
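
Because the dataset exposes one configuration per task plus the aggregated "results" configuration, it can also be useful to enumerate the available configurations or load the aggregated scores directly. The snippet below is a minimal sketch assuming the configuration names follow the pattern shown above; the exact list can always be retrieved at runtime.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_dillfrescott__trinity-medium"

# List every available configuration (one per evaluated task, plus "results").
configs = get_dataset_config_names(repo)
print(len(configs), "configurations found")

# Load the aggregated metrics; the "latest" split (as listed in the
# configuration metadata) points to the most recent run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```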
## Latest results
These are the [latest results from run 2023-12-29T19:31:05.351110](https://huggingface.co/datasets/open-llm-leaderboard/details_dillfrescott__trinity-medium/blob/main/results_2023-12-29T19-31-05.351110.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6543513981322463,
"acc_stderr": 0.03200522597754596,
"acc_norm": 0.65517466197835,
"acc_norm_stderr": 0.03265417586520206,
"mc1": 0.5630354957160343,
"mc1_stderr": 0.017363844503195957,
"mc2": 0.6954134254414035,
"mc2_stderr": 0.015047304382402624
},
"harness|arc:challenge|25": {
"acc": 0.6868600682593856,
"acc_stderr": 0.013552671543623497,
"acc_norm": 0.7150170648464164,
"acc_norm_stderr": 0.013191348179838795
},
"harness|hellaswag|10": {
"acc": 0.6963752240589524,
"acc_stderr": 0.004588827958775116,
"acc_norm": 0.869946225851424,
"acc_norm_stderr": 0.0033567515689037672
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.035506839891655796,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.035506839891655796
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813821,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813821
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.022755204959542946,
"acc_norm": 0.8,
"acc_norm_stderr": 0.022755204959542946
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971128,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971128
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.02616056824660146,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.02616056824660146
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608311,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608311
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4770949720670391,
"acc_stderr": 0.016704945740326188,
"acc_norm": 0.4770949720670391,
"acc_norm_stderr": 0.016704945740326188
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984806,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984806
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214963,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214963
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146293,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146293
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495148,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495148
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5630354957160343,
"mc1_stderr": 0.017363844503195957,
"mc2": 0.6954134254414035,
"mc2_stderr": 0.015047304382402624
},
"harness|winogrande|5": {
"acc": 0.8113654301499605,
"acc_stderr": 0.010995172318019816
},
"harness|gsm8k|5": {
"acc": 0.6504927975739196,
"acc_stderr": 0.013133836511705991
}
}
```
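
If you prefer to work with these per-task scores programmatically rather than read them from the JSON above, one option is to flatten them into a table. This is only a sketch that assumes the dictionary shown above has been loaded into a variable named `results`; it is not part of the evaluation harness itself.

```python
import pandas as pd

# `results` is assumed to hold the dictionary printed above.
rows = []
for task, metrics in results.items():
    if task == "all":
        continue  # skip the aggregate entry
    rows.append({
        "task": task,
        "acc": metrics.get("acc"),
        "acc_norm": metrics.get("acc_norm"),
    })

# Sort tasks by normalized accuracy to see the strongest and weakest areas.
df = pd.DataFrame(rows).sort_values("acc_norm", ascending=False)
print(df.head(10))
```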
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_dillfrescott__trinity-medium | [
"region:us"
] | 2023-12-29T19:33:21+00:00 | {"pretty_name": "Evaluation run of dillfrescott/trinity-medium", "dataset_summary": "Dataset automatically created during the evaluation run of model [dillfrescott/trinity-medium](https://huggingface.co/dillfrescott/trinity-medium) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dillfrescott__trinity-medium\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T19:31:05.351110](https://huggingface.co/datasets/open-llm-leaderboard/details_dillfrescott__trinity-medium/blob/main/results_2023-12-29T19-31-05.351110.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6543513981322463,\n \"acc_stderr\": 0.03200522597754596,\n \"acc_norm\": 0.65517466197835,\n \"acc_norm_stderr\": 0.03265417586520206,\n \"mc1\": 0.5630354957160343,\n \"mc1_stderr\": 0.017363844503195957,\n \"mc2\": 0.6954134254414035,\n \"mc2_stderr\": 0.015047304382402624\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6868600682593856,\n \"acc_stderr\": 0.013552671543623497,\n \"acc_norm\": 0.7150170648464164,\n \"acc_norm_stderr\": 0.013191348179838795\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6963752240589524,\n \"acc_stderr\": 0.004588827958775116,\n \"acc_norm\": 0.869946225851424,\n \"acc_norm_stderr\": 0.0033567515689037672\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 
0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.035506839891655796,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.035506839891655796\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.022755204959542946,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.022755204959542946\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6692307692307692,\n \"acc_stderr\": 0.023854795680971128,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971128\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073332,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073332\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n \"acc_stderr\": 0.013428186370608311,\n \"acc_norm\": 
0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608311\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134128,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134128\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4770949720670391,\n \"acc_stderr\": 0.016704945740326188,\n \"acc_norm\": 0.4770949720670391,\n \"acc_norm_stderr\": 0.016704945740326188\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984806,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984806\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n \"acc_stderr\": 0.012736153390214963,\n \"acc_norm\": 0.4634941329856584,\n \"acc_norm_stderr\": 0.012736153390214963\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495148,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495148\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5630354957160343,\n \"mc1_stderr\": 0.017363844503195957,\n \"mc2\": 0.6954134254414035,\n \"mc2_stderr\": 0.015047304382402624\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.010995172318019816\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6504927975739196,\n \"acc_stderr\": 0.013133836511705991\n }\n}\n```", "repo_url": "https://huggingface.co/dillfrescott/trinity-medium", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|arc:challenge|25_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|gsm8k|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hellaswag|10_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-31-05.351110.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-31-05.351110.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-31-05.351110.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T19-31-05.351110.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-31-05.351110.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-31-05.351110.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["**/details_harness|winogrande|5_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T19-31-05.351110.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T19_31_05.351110", "path": ["results_2023-12-29T19-31-05.351110.parquet"]}, {"split": "latest", "path": 
["results_2023-12-29T19-31-05.351110.parquet"]}]}]} | 2023-12-29T19:33:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of dillfrescott/trinity-medium
Dataset automatically created during the evaluation run of model dillfrescott/trinity-medium on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-29T19:31:05.351110 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of dillfrescott/trinity-medium\n\n\n\nDataset automatically created during the evaluation run of model dillfrescott/trinity-medium on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T19:31:05.351110(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of dillfrescott/trinity-medium\n\n\n\nDataset automatically created during the evaluation run of model dillfrescott/trinity-medium on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T19:31:05.351110(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dillfrescott/trinity-medium\n\n\n\nDataset automatically created during the evaluation run of model dillfrescott/trinity-medium on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T19:31:05.351110(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
7c598f345251569df8931ec95e032f4962d83fce |
# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-Mistral-5000-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abdulrahman-nuzha/finetuned-Mistral-5000-v1.0](https://huggingface.co/abdulrahman-nuzha/finetuned-Mistral-5000-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-Mistral-5000-v1.0",
"harness_winogrande_5",
split="train")
```
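
The repository exposes one configuration per evaluated task plus the aggregated `results` configuration. As a minimal sketch (assuming the `datasets` library is installed; the repo and config names below simply follow the pattern used in this card), you can enumerate the configurations and pull the aggregated metrics like this:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-Mistral-5000-v1.0"

# List the per-task configurations plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# The "results" configuration holds the aggregated metrics; the "latest" split
# always points to the most recent evaluation run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```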
## Latest results
These are the [latest results from run 2023-12-29T19:44:16.561638](https://huggingface.co/datasets/open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-Mistral-5000-v1.0/blob/main/results_2023-12-29T19-44-16.561638.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6135247894547725,
"acc_stderr": 0.03300396491195961,
"acc_norm": 0.6200318264788953,
"acc_norm_stderr": 0.03368910509040065,
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522514,
"mc2": 0.41174645037365515,
"mc2_stderr": 0.01414010986158355
},
"harness|arc:challenge|25": {
"acc": 0.5537542662116041,
"acc_stderr": 0.014526705548539982,
"acc_norm": 0.5989761092150171,
"acc_norm_stderr": 0.014322255790719867
},
"harness|hellaswag|10": {
"acc": 0.6183031268671579,
"acc_stderr": 0.004848099661619693,
"acc_norm": 0.8237402907787293,
"acc_norm_stderr": 0.003802622341529015
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.024757473902752042,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.024757473902752042
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365904,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365904
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139746,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139746
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478926,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478926
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066482,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.01612927102509987,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.01612927102509987
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.02845882099146031,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.02845882099146031
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543674,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543674
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333567,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35977653631284917,
"acc_stderr": 0.016051419760310263,
"acc_norm": 0.35977653631284917,
"acc_norm_stderr": 0.016051419760310263
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.026624152478845853,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.026624152478845853
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4282920469361147,
"acc_stderr": 0.012638223880313167,
"acc_norm": 0.4282920469361147,
"acc_norm_stderr": 0.012638223880313167
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.01970687580408563,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.01970687580408563
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484375,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484375
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623326,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623326
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522514,
"mc2": 0.41174645037365515,
"mc2_stderr": 0.01414010986158355
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.011585871710209403
},
"harness|gsm8k|5": {
"acc": 0.310841546626232,
"acc_stderr": 0.012748860507777718
}
}
```
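
As an illustration only (not part of the generated results), the metrics above can be post-processed in a few lines. The sketch below assumes `results` is the per-task dictionary shown above and averages the `acc` field over the `hendrycksTest` (MMLU) tasks:

```python
# `results` is assumed to be the per-task dictionary printed above, e.g.
# results["harness|hendrycksTest-anatomy|5"]["acc"] == 0.5925925925925926.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    # ... remaining hendrycksTest entries elided for brevity ...
}

# Average accuracy over the MMLU-style tasks.
mmlu_scores = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
print(f"MMLU average over {len(mmlu_scores)} tasks: {sum(mmlu_scores) / len(mmlu_scores):.4f}")
```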
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-Mistral-5000-v1.0 | [
"region:us"
] | 2023-12-29T19:46:32+00:00 | {"pretty_name": "Evaluation run of abdulrahman-nuzha/finetuned-Mistral-5000-v1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [abdulrahman-nuzha/finetuned-Mistral-5000-v1.0](https://huggingface.co/abdulrahman-nuzha/finetuned-Mistral-5000-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-Mistral-5000-v1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T19:44:16.561638](https://huggingface.co/datasets/open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-Mistral-5000-v1.0/blob/main/results_2023-12-29T19-44-16.561638.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6135247894547725,\n \"acc_stderr\": 0.03300396491195961,\n \"acc_norm\": 0.6200318264788953,\n \"acc_norm_stderr\": 0.03368910509040065,\n \"mc1\": 0.26438188494492043,\n \"mc1_stderr\": 0.015438211119522514,\n \"mc2\": 0.41174645037365515,\n \"mc2_stderr\": 0.01414010986158355\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5537542662116041,\n \"acc_stderr\": 0.014526705548539982,\n \"acc_norm\": 0.5989761092150171,\n \"acc_norm_stderr\": 0.014322255790719867\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6183031268671579,\n \"acc_stderr\": 0.004848099661619693,\n \"acc_norm\": 0.8237402907787293,\n \"acc_norm_stderr\": 0.003802622341529015\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36243386243386244,\n \"acc_stderr\": 0.024757473902752042,\n \"acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.024757473902752042\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365904,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365904\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139746,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139746\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478926,\n \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478926\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8293577981651377,\n \"acc_stderr\": 0.01612927102509987,\n \"acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.01612927102509987\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.03362277436608043,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03362277436608043\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.03058759135160425,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.03058759135160425\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7426160337552743,\n \"acc_stderr\": 0.02845882099146031,\n \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.02845882099146031\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.024414947304543674,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.024414947304543674\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7790549169859514,\n \"acc_stderr\": 0.014836205167333567,\n \"acc_norm\": 0.7790549169859514,\n \"acc_norm_stderr\": 0.014836205167333567\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35977653631284917,\n \"acc_stderr\": 0.016051419760310263,\n \"acc_norm\": 0.35977653631284917,\n \"acc_norm_stderr\": 0.016051419760310263\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.026624152478845853,\n \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.026624152478845853\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4282920469361147,\n \"acc_stderr\": 0.012638223880313167,\n \"acc_norm\": 0.4282920469361147,\n \"acc_norm_stderr\": 0.012638223880313167\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.02909720956841195,\n \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.02909720956841195\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6127450980392157,\n \"acc_stderr\": 0.01970687580408563,\n \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.01970687580408563\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484375,\n \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484375\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623326,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623326\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26438188494492043,\n \"mc1_stderr\": 0.015438211119522514,\n \"mc2\": 0.41174645037365515,\n \"mc2_stderr\": 0.01414010986158355\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.011585871710209403\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.310841546626232,\n \"acc_stderr\": 0.012748860507777718\n 
}\n}\n```", "repo_url": "https://huggingface.co/abdulrahman-nuzha/finetuned-Mistral-5000-v1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|arc:challenge|25_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|gsm8k|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hellaswag|10_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-44-16.561638.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-44-16.561638.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-44-16.561638.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T19-44-16.561638.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-44-16.561638.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T19_44_16.561638", "path": ["**/details_harness|winogrande|5_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T19-44-16.561638.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_29T19_44_16.561638", "path": ["results_2023-12-29T19-44-16.561638.parquet"]}, {"split": "latest", "path": ["results_2023-12-29T19-44-16.561638.parquet"]}]}]} | 2023-12-29T19:46:55+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-Mistral-5000-v1.0
Dataset automatically created during the evaluation run of model abdulrahman-nuzha/finetuned-Mistral-5000-v1.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
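For example, a minimal sketch (the details repository id below is inferred from the leaderboard's usual `details_<org>__<model>` naming pattern and is an assumption, not confirmed by this card):

```python
from datasets import load_dataset

# Assumed repo id, following the "details_<org>__<model>" convention
# used by Open LLM Leaderboard details datasets.
data = load_dataset(
    "open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-Mistral-5000-v1.0",
    "harness_winogrande_5",
    split="train",
)
```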
## Latest results
These are the latest results from run 2023-12-29T19:44:16.561638 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks). You can find each in the results and the "latest" split for each eval:
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-Mistral-5000-v1.0\n\n\n\nDataset automatically created during the evaluation run of model abdulrahman-nuzha/finetuned-Mistral-5000-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T19:44:16.561638(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-Mistral-5000-v1.0\n\n\n\nDataset automatically created during the evaluation run of model abdulrahman-nuzha/finetuned-Mistral-5000-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T19:44:16.561638(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
199,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-Mistral-5000-v1.0\n\n\n\nDataset automatically created during the evaluation run of model abdulrahman-nuzha/finetuned-Mistral-5000-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T19:44:16.561638(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
d145465f00011fd3e1b00a0443920138eec4fe78 |
This dataset is a simple collection of 182,079 documents found in the [Founder's Online Metadata](https://founders.archives.gov/Metadata/founders-online-metadata.json)
with the matching content from the API.
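For reference, a minimal sketch of how the metadata side of that collection might look. The metadata URL is the one linked above; the JSON structure (a list of records with a `permalink` field) is assumed from the dataset columns, and the per-document content endpoint is not documented here, so `fetch_content` is a hypothetical placeholder:

```python
import requests

METADATA_URL = "https://founders.archives.gov/Metadata/founders-online-metadata.json"

def fetch_content(permalink: str) -> str:
    # Hypothetical placeholder: the card does not document the content
    # endpoint, so substitute whatever API call returns the document body.
    raise NotImplementedError

metadata = requests.get(METADATA_URL, timeout=60).json()
print(f"{len(metadata)} documents listed")

# Pair each metadata record with its fetched content (placeholder call):
# records = [{**doc, "content": fetch_content(doc["permalink"])} for doc in metadata]
```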
This was just a quick weekend project idea and so I haven't spent much time on it. It should not be used without cleanup.
I have no affiliation with [NHPRC](https://www.archives.gov/nhprc) or the [University of Virginia Press](https://www.upress.virginia.edu),
I merely collected the data for my own personal interest.
## Quick notes
I had never attempted making my own HF datasets before so I went through a few attempts to upload and I am quite sure I did it wrong.
I did attempt an initial fine-tuning run using it as is anyway, which took a few weeks, but it wasn't worth it and I lack the time to do it right.
The intent was for the data to be cleaned up and structured for training or finetuning a model, but I have not had the time.
## Problem Documents
The data currently includes some sort of composite entries that need to be removed,
such as duplicative monthly journal entries, where there exists a document for each day already.
These composite documents exist in the original API, so I had no way to filter them in this initial pass.
Some digging has me concerned that some of those composite entries may also include copyrighted notes not available under the CC use,
so I would be wary of training before they are removed anyway.
There are also many documents that are likely not suitable as is for training,
where they include shorthand and incomplete notes from journals, and it might be saner to remove them.
Most documents also have very inconsistent spacing, indentation, and line breaks that likely should be cleaned up depending on the use case.
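As a starting point, a minimal sketch of loading the dataset and normalizing that whitespace (the exact cleanup rules will depend on your use case; only the `content` column is touched here):

```python
import re
from datasets import load_dataset

ds = load_dataset("RandomThinker42/FoundersArchives", split="train")

def normalize_whitespace(example):
    text = example["content"]
    text = re.sub(r"[ \t]+", " ", text)      # collapse runs of spaces/tabs
    text = re.sub(r"\n{3,}", "\n\n", text)   # collapse excessive blank lines
    example["content"] = text.strip()
    return example

ds = ds.map(normalize_whitespace)
```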
## Missing Documents
This is missing at least 2548 documents from the John Jay collection,
because the API does not yet have those documents.
An additional 281 documents from other collections were automatically excluded because the API returned no content.
# From the [website:](https://www.archives.gov/open/nhprc/dataset-founders-online)
### What is the data?
This dataset provides information about the more than 180,000 documents published as part of [Founders Online](founders.archives.gov).
Founders Online includes the correspondence and writings of John Adams, Benjamin Franklin, Alexander Hamilton, Thomas Jefferson, James Madison and George Washington.
### Is this dataset in the public domain?
As the work of the [University of Virginia Press](https://www.upress.virginia.edu), this data is released for non-commercial use with attribution.
| RandomThinker42/FoundersArchives | [
"license:cc-by-nc-4.0",
"region:us"
] | 2023-12-29T20:08:52+00:00 | {"license": "cc-by-nc-4.0", "dataset_info": {"features": [{"name": "title", "dtype": "string"}, {"name": "permalink", "dtype": "string"}, {"name": "project", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "recipients", "dtype": "string"}, {"name": "date-from", "dtype": "timestamp[s]"}, {"name": "date-to", "dtype": "timestamp[s]"}, {"name": "content", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 452725661, "num_examples": 181586}], "download_size": 241059447, "dataset_size": 452725661}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-11T00:07:39+00:00 | [] | [] | TAGS
#license-cc-by-nc-4.0 #region-us
|
This dataset is a simple collection of 182,079 documents found in the Founder's Online Metadata
with the matching content from the API.
This was just a quick weekend project idea and so I haven't spent much time on it. It should not be used without cleanup.
I have no affiliation with NHPRC or the University of Virginia Press,
I merely collected the data for my own personal interest.
## Quick notes
I had never attempted making my own HF datasets before so I went through a few attempts to upload and I am quite sure I did it wrong.
I did attempt an initial fine-tuning run using it as is anyway, which took a few weeks, but it wasn't worth it and I lack the time to do it right.
The intent was for the data to be cleaned up and structured for training or finetuning a model, but I have not had the time.
## Problem Documents
The data currently includes some sort of composite entries that need to be removed,
such as duplicative monthly journal entries, where there exists a document for each day already.
These composite documents exist in the original API, so I had no way to filter them in this initial pass.
Some digging has me concerned that some of those composite entries may also include copyrighted notes not available under the CC use,
so I would be wary of training before they are removed anyway.
There are also many documents that are likely not suitable as is for training,
where they include shorthand and incomplete notes from journals, and it might be saner to remove them.
Most documents also have very inconsistent spacing, indentation, and line breaks that likely should be cleaned up depending on the use case.
## Missing Documents
This is missing at least 2548 documents from the John Jay collection,
because the API does not yet have those documents.
An additional 281 documents from other collections were automatically excluded because the API returned no content.
# From the website:
### What is the data?
This dataset provides information about the more than 180,000 documents published as part of Founders Online.
Founders Online includes the correspondence and writings of John Adams, Benjamin Franklin, Alexander Hamilton, Thomas Jefferson, James Madison and George Washington.
### Is this dataset in the public domain?
As the work of the University of Virginia Press, this data is released for non-commercial use with attribution.
| [
"## Quick notes\n\nI had never attempted making my own HF datasets before so I went through a few attempts to upload and I am quite sure I did it wrong. \n\nI did attempt an initial fine-tuning run using it as is anyway, which took a few weeks, but it wasn't worth it and I lack the time to do it right. \n\nThe intent was to be cleaned up and structured for training or finetuning a model, but have not had the time.",
"## Problem Documents\n\nThe data currently includes some sort of composite entries that need to be removed, \nsuch as duplicative monthly journal entries, where there exists a document for each day already.\n\nThese composite documents exist in the original API, so I had no way to filter them in this initial pass.\n\nSome digging has me concerned that some of those composite entries may also include copyrighted notes not available under the CC use,\nso I would be wary of training before they are removed anyway.\n\nThere are also many documents that are likely not suitable as is for training, \nwhere they include short hand and incomplete notes from journals, and might be saner to remove them.\n\nMost documents also have very inconsistant spacing indententation, and line breaks, that likely should be removed depending on use case.",
"## Missing Documents\n\nThis is missing at least 2548 documents from the John Jay collection, \nbecause the API does not yet have those documents. \n\nAn additional 281 documents from other collections were automatically excluded because the API returned no content.",
"# From the website:",
"### What is the data?\n\nThis dataset provides information about the more than 180,000 documents published as part of Founders Online. \n\nFounders Online includes the correspondence and writings of John Adams, Benjamin Franklin, Alexander Hamilton, Thomas Jefferson, James Madison and George Washington.",
"### Is this dataset in the public domain?\n\nAs the work of the University of Virginia Press, this data is released for non-commerical use and by attribution."
] | [
"TAGS\n#license-cc-by-nc-4.0 #region-us \n",
"## Quick notes\n\nI had never attempted making my own HF datasets before so I went through a few attempts to upload and I am quite sure I did it wrong. \n\nI did attempt an initial fine-tuning run using it as is anyway, which took a few weeks, but it wasn't worth it and I lack the time to do it right. \n\nThe intent was to be cleaned up and structured for training or finetuning a model, but have not had the time.",
"## Problem Documents\n\nThe data currently includes some sort of composite entries that need to be removed, \nsuch as duplicative monthly journal entries, where there exists a document for each day already.\n\nThese composite documents exist in the original API, so I had no way to filter them in this initial pass.\n\nSome digging has me concerned that some of those composite entries may also include copyrighted notes not available under the CC use,\nso I would be wary of training before they are removed anyway.\n\nThere are also many documents that are likely not suitable as is for training, \nwhere they include short hand and incomplete notes from journals, and might be saner to remove them.\n\nMost documents also have very inconsistant spacing indententation, and line breaks, that likely should be removed depending on use case.",
"## Missing Documents\n\nThis is missing at least 2548 documents from the John Jay collection, \nbecause the API does not yet have those documents. \n\nAn additional 281 documents from other collections were automatically excluded because the API returned no content.",
"# From the website:",
"### What is the data?\n\nThis dataset provides information about the more than 180,000 documents published as part of Founders Online. \n\nFounders Online includes the correspondence and writings of John Adams, Benjamin Franklin, Alexander Hamilton, Thomas Jefferson, James Madison and George Washington.",
"### Is this dataset in the public domain?\n\nAs the work of the University of Virginia Press, this data is released for non-commerical use and by attribution."
] | [
17,
103,
176,
49,
5,
56,
37
] | [
"passage: TAGS\n#license-cc-by-nc-4.0 #region-us \n## Quick notes\n\nI had never attempted making my own HF datasets before so I went through a few attempts to upload and I am quite sure I did it wrong. \n\nI did attempt an initial fine-tuning run using it as is anyway, which took a few weeks, but it wasn't worth it and I lack the time to do it right. \n\nThe intent was to be cleaned up and structured for training or finetuning a model, but have not had the time.## Problem Documents\n\nThe data currently includes some sort of composite entries that need to be removed, \nsuch as duplicative monthly journal entries, where there exists a document for each day already.\n\nThese composite documents exist in the original API, so I had no way to filter them in this initial pass.\n\nSome digging has me concerned that some of those composite entries may also include copyrighted notes not available under the CC use,\nso I would be wary of training before they are removed anyway.\n\nThere are also many documents that are likely not suitable as is for training, \nwhere they include short hand and incomplete notes from journals, and might be saner to remove them.\n\nMost documents also have very inconsistant spacing indententation, and line breaks, that likely should be removed depending on use case.## Missing Documents\n\nThis is missing at least 2548 documents from the John Jay collection, \nbecause the API does not yet have those documents. \n\nAn additional 281 documents from other collections were automatically excluded because the API returned no content.# From the website:### What is the data?\n\nThis dataset provides information about the more than 180,000 documents published as part of Founders Online. \n\nFounders Online includes the correspondence and writings of John Adams, Benjamin Franklin, Alexander Hamilton, Thomas Jefferson, James Madison and George Washington.### Is this dataset in the public domain?\n\nAs the work of the University of Virginia Press, this data is released for non-commerical use and by attribution."
] |
3316a09be6f0b5624fd83dd8c70f2b1d1a00f247 |
# This is a meditation dataset generated with gpt-3.5-turbo
I made the data by generating a list of 85 meditation intentions (combinations of goals and themes) in ChatGPT. For example, goal: `develop compassion`, theme: `cultivating a non-judgmental attitude`.
Then, I prompted `gpt-3.5-turbo` to create three meditations for each intention with a temperature of 1.1:
```You are a secular buddhist monk. Give me a daily meditation to {goal} with a focus on {focus}. Do not include any introductory text.```
[Details here](https://medium.com/@berdaniera/generating-synthetic-training-data-with-llms-eb987eb3629a)
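For context, a minimal sketch of what that generation loop might have looked like, assuming the openai>=1.0 Python client. Whether the monk persona went into a system or user message isn't specified, so a single user message is used here, and the goal/focus pair shown is just one of the 85 intentions:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PROMPT = (
    "You are a secular buddhist monk. Give me a daily meditation to {goal} "
    "with a focus on {focus}. Do not include any introductory text."
)

# Illustrative intention; the real list contained 85 goal/focus pairs.
intentions = [{"goal": "develop compassion", "focus": "cultivating a non-judgmental attitude"}]

meditations = []
for intent in intentions:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        temperature=1.1,
        n=3,  # three meditations per intention
        messages=[{"role": "user", "content": PROMPT.format(**intent)}],
    )
    meditations.extend(choice.message.content for choice in response.choices)
```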
### Risks:
A spot check looks pretty good, but I haven't read all of them.
### License:
You can share and adapt this data with attribution under the cc-by-4.0 license.
## Contact:
Message me if you have questions! | berdaniera/meditation | [
"task_categories:text-generation",
"language:en",
"license:cc-by-4.0",
"region:us"
] | 2023-12-29T20:10:19+00:00 | {"language": ["en"], "license": "cc-by-4.0", "task_categories": ["text-generation"]} | 2024-01-19T15:49:07+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #language-English #license-cc-by-4.0 #region-us
|
# This is a meditation dataset generated with gpt-3.5-turbo
I made the data by generating a list of 85 meditation intentions (combinations of goals and themes) in ChatGPT. For example, goal: 'develop compassion', theme: 'cultivating a non-judgmental attitude'.
Then, I prompted 'gpt-3.5-turbo' to create three meditations for each intention with a temperature of 1.1:
Details here
### Risks:
A spot check looks pretty good, but I haven't read all of them.
### License:
You can share and adapt this data with attribution under the cc-by-4.0 license.
## Contact:
Message me if you have questions! | [
"# This is a meditation dataset generated with gpt-3.5-turbo\n\nI made the data by generating a list of 85 meditation intentions (combinations of goals and themes) in ChatGPT. For example, goal: 'develop compassion', theme: 'cultivating a non-judgmental attitude'.\n\nThen, I prompted 'gpt-3.5-turbo' to create three meditations for each intention with a temperature of 1.1:\n\n\nDetails here",
"### Risks:\nA spot check looks pretty good, but I haven't read all of them.",
"### License:\nYou can share and adapt this data with attribution under the cc-by-4.0 license.",
"## Contact:\nMessage me if you have questions!"
] | [
"TAGS\n#task_categories-text-generation #language-English #license-cc-by-4.0 #region-us \n",
"# This is a meditation dataset generated with gpt-3.5-turbo\n\nI made the data by generating a list of 85 meditation intentions (combinations of goals and themes) in ChatGPT. For example, goal: 'develop compassion', theme: 'cultivating a non-judgmental attitude'.\n\nThen, I prompted 'gpt-3.5-turbo' to create three meditations for each intention with a temperature of 1.1:\n\n\nDetails here",
"### Risks:\nA spot check looks pretty good, but I haven't read all of them.",
"### License:\nYou can share and adapt this data with attribution under the cc-by-4.0 license.",
"## Contact:\nMessage me if you have questions!"
] | [
30,
103,
22,
24,
11
] | [
"passage: TAGS\n#task_categories-text-generation #language-English #license-cc-by-4.0 #region-us \n# This is a meditation dataset generated with gpt-3.5-turbo\n\nI made the data by generating a list of 85 meditation intentions (combinations of goals and themes) in ChatGPT. For example, goal: 'develop compassion', theme: 'cultivating a non-judgmental attitude'.\n\nThen, I prompted 'gpt-3.5-turbo' to create three meditations for each intention with a temperature of 1.1:\n\n\nDetails here### Risks:\nA spot check looks pretty good, but I haven't read all of them.### License:\nYou can share and adapt this data with attribution under the cc-by-4.0 license.## Contact:\nMessage me if you have questions!"
] |
b549472b0f976d9351e66f5e03be9dd26f568207 |
# Dataset Card for Evaluation run of jeonsworld/CarbonVillain-en-13B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jeonsworld/CarbonVillain-en-13B-v1](https://huggingface.co/jeonsworld/CarbonVillain-en-13B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-13B-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-29T20:14:53.981182](https://huggingface.co/datasets/open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-13B-v1/blob/main/results_2023-12-29T20-14-53.981182.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks). You can find each in the results and the "latest" split for each eval:
```python
{
"all": {
"acc": 0.6677851094474622,
"acc_stderr": 0.031647346301320364,
"acc_norm": 0.6687652386109932,
"acc_norm_stderr": 0.032290288467975714,
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7197651592692368,
"mc2_stderr": 0.014984462732010536
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266125
},
"harness|hellaswag|10": {
"acc": 0.7148974307906791,
"acc_stderr": 0.00450540617660685,
"acc_norm": 0.8845847440748855,
"acc_norm_stderr": 0.0031886940284536315
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5026455026455027,
"acc_stderr": 0.02575094967813038,
"acc_norm": 0.5026455026455027,
"acc_norm_stderr": 0.02575094967813038
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.021576248184514587,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.021576248184514587
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03011768892950357,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03011768892950357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374308,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992005,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992005
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39888268156424583,
"acc_stderr": 0.016376966142610073,
"acc_norm": 0.39888268156424583,
"acc_norm_stderr": 0.016376966142610073
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.02301670564026219,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.02301670564026219
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4934810951760104,
"acc_stderr": 0.012769150688867503,
"acc_norm": 0.4934810951760104,
"acc_norm_stderr": 0.012769150688867503
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103128,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103128
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7197651592692368,
"mc2_stderr": 0.014984462732010536
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828075
},
"harness|gsm8k|5": {
"acc": 0.6429112964366944,
"acc_stderr": 0.013197931775445206
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-13B-v1 | [
"region:us"
] | 2023-12-29T20:17:12+00:00 | {"pretty_name": "Evaluation run of jeonsworld/CarbonVillain-en-13B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [jeonsworld/CarbonVillain-en-13B-v1](https://huggingface.co/jeonsworld/CarbonVillain-en-13B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-13B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T20:14:53.981182](https://huggingface.co/datasets/open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-13B-v1/blob/main/results_2023-12-29T20-14-53.981182.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6677851094474622,\n \"acc_stderr\": 0.031647346301320364,\n \"acc_norm\": 0.6687652386109932,\n \"acc_norm_stderr\": 0.032290288467975714,\n \"mc1\": 0.572827417380661,\n \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7197651592692368,\n \"mc2_stderr\": 0.014984462732010536\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266125\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7148974307906791,\n \"acc_stderr\": 0.00450540617660685,\n \"acc_norm\": 0.8845847440748855,\n \"acc_norm_stderr\": 0.0031886940284536315\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5026455026455027,\n \"acc_stderr\": 0.02575094967813038,\n \"acc_norm\": 0.5026455026455027,\n \"acc_norm_stderr\": 0.02575094967813038\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8258064516129032,\n \"acc_stderr\": 0.021576248184514587,\n \"acc_norm\": 0.8258064516129032,\n \"acc_norm_stderr\": 0.021576248184514587\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03011768892950357,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03011768892950357\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8071519795657727,\n \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992005,\n \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992005\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n \"acc_stderr\": 0.016376966142610073,\n \"acc_norm\": 0.39888268156424583,\n \"acc_norm_stderr\": 0.016376966142610073\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.02301670564026219,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.02301670564026219\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4934810951760104,\n \"acc_stderr\": 0.012769150688867503,\n \"acc_norm\": 0.4934810951760104,\n \"acc_norm_stderr\": 0.012769150688867503\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.572827417380661,\n \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7197651592692368,\n \"mc2_stderr\": 0.014984462732010536\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6429112964366944,\n \"acc_stderr\": 
0.013197931775445206\n }\n}\n```", "repo_url": "https://huggingface.co/jeonsworld/CarbonVillain-en-13B-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|arc:challenge|25_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|gsm8k|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hellaswag|10_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-14-53.981182.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-14-53.981182.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-14-53.981182.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T20-14-53.981182.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-14-53.981182.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T20_14_53.981182", "path": ["**/details_harness|winogrande|5_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T20-14-53.981182.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_29T20_14_53.981182", "path": ["results_2023-12-29T20-14-53.981182.parquet"]}, {"split": "latest", "path": ["results_2023-12-29T20-14-53.981182.parquet"]}]}]} | 2023-12-29T20:17:42+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jeonsworld/CarbonVillain-en-13B-v1
Dataset automatically created during the evaluation run of model jeonsworld/CarbonVillain-en-13B-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
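For instance, mirroring the loading snippet recorded in this card's metadata above (the repo id, configuration name, and split are taken from there, not invented here):

```python
from datasets import load_dataset

# Load the per-sample details for one task configuration of this run;
# repo id and config name come from the dataset metadata listed above.
data = load_dataset(
    "open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-13B-v1",
    "harness_winogrande_5",
    split="train",
)
```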
## Latest results
These are the latest results from run 2023-12-29T20:14:53.981182 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
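These aggregated numbers live in the "results" configuration; a minimal sketch, assuming the config and split names listed in the configs metadata above:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split points at the most recent evaluation
# (here 2023-12-29T20:14:53.981182).
results = load_dataset(
    "open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-13B-v1",
    "results",
    split="latest",
)
print(results[0])  # inspect the stored aggregate scores (row structure not documented here)
```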
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of jeonsworld/CarbonVillain-en-13B-v1\n\n\n\nDataset automatically created during the evaluation run of model jeonsworld/CarbonVillain-en-13B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T20:14:53.981182(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jeonsworld/CarbonVillain-en-13B-v1\n\n\n\nDataset automatically created during the evaluation run of model jeonsworld/CarbonVillain-en-13B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T20:14:53.981182(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jeonsworld/CarbonVillain-en-13B-v1\n\n\n\nDataset automatically created during the evaluation run of model jeonsworld/CarbonVillain-en-13B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T20:14:53.981182(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
31ffe59588e05d586ba8d4a6e6d47cadec2300d6 |
# Dataset Card for Evaluation run of jeonsworld/CarbonVillain-en-10.7B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jeonsworld/CarbonVillain-en-10.7B-v1](https://huggingface.co/jeonsworld/CarbonVillain-en-10.7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v1",
"harness_winogrande_5",
split="train")
```
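
The aggregated metrics shown under "Latest results" below can be reloaded from the "results" configuration; a minimal sketch, assuming this dataset follows the same "results" config / "latest" split convention used by the other leaderboard detail datasets:

```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" always points to the most recent evaluation.
# The "results" config and "latest" split names are assumed from the standard
# leaderboard-details layout rather than taken from this card's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v1",
    "results",
    split="latest",
)
```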
## Latest results
These are the [latest results from run 2023-12-29T20:15:36.884484](https://huggingface.co/datasets/open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v1/blob/main/results_2023-12-29T20-15-36.884484.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6677851094474622,
"acc_stderr": 0.031647346301320364,
"acc_norm": 0.6687652386109932,
"acc_norm_stderr": 0.032290288467975714,
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7197651592692368,
"mc2_stderr": 0.014984462732010536
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266125
},
"harness|hellaswag|10": {
"acc": 0.7148974307906791,
"acc_stderr": 0.00450540617660685,
"acc_norm": 0.8845847440748855,
"acc_norm_stderr": 0.0031886940284536315
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5026455026455027,
"acc_stderr": 0.02575094967813038,
"acc_norm": 0.5026455026455027,
"acc_norm_stderr": 0.02575094967813038
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.021576248184514587,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.021576248184514587
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03011768892950357,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03011768892950357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374308,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992005,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992005
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39888268156424583,
"acc_stderr": 0.016376966142610073,
"acc_norm": 0.39888268156424583,
"acc_norm_stderr": 0.016376966142610073
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.02301670564026219,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.02301670564026219
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4934810951760104,
"acc_stderr": 0.012769150688867503,
"acc_norm": 0.4934810951760104,
"acc_norm_stderr": 0.012769150688867503
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103128,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103128
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7197651592692368,
"mc2_stderr": 0.014984462732010536
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828075
},
"harness|gsm8k|5": {
"acc": 0.6429112964366944,
"acc_stderr": 0.013197931775445206
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v1 | [
"region:us"
] | 2023-12-29T20:17:56+00:00 | {"pretty_name": "Evaluation run of jeonsworld/CarbonVillain-en-10.7B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [jeonsworld/CarbonVillain-en-10.7B-v1](https://huggingface.co/jeonsworld/CarbonVillain-en-10.7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T20:15:36.884484](https://huggingface.co/datasets/open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v1/blob/main/results_2023-12-29T20-15-36.884484.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6677851094474622,\n \"acc_stderr\": 0.031647346301320364,\n \"acc_norm\": 0.6687652386109932,\n \"acc_norm_stderr\": 0.032290288467975714,\n \"mc1\": 0.572827417380661,\n \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7197651592692368,\n \"mc2_stderr\": 0.014984462732010536\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266125\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7148974307906791,\n \"acc_stderr\": 0.00450540617660685,\n \"acc_norm\": 0.8845847440748855,\n \"acc_norm_stderr\": 0.0031886940284536315\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5026455026455027,\n \"acc_stderr\": 0.02575094967813038,\n \"acc_norm\": 0.5026455026455027,\n \"acc_norm_stderr\": 0.02575094967813038\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8258064516129032,\n \"acc_stderr\": 0.021576248184514587,\n \"acc_norm\": 0.8258064516129032,\n \"acc_norm_stderr\": 0.021576248184514587\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03011768892950357,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03011768892950357\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8071519795657727,\n \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992005,\n \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992005\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n \"acc_stderr\": 0.016376966142610073,\n \"acc_norm\": 0.39888268156424583,\n \"acc_norm_stderr\": 0.016376966142610073\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.02301670564026219,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.02301670564026219\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4934810951760104,\n \"acc_stderr\": 0.012769150688867503,\n \"acc_norm\": 0.4934810951760104,\n \"acc_norm_stderr\": 0.012769150688867503\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.572827417380661,\n \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7197651592692368,\n \"mc2_stderr\": 0.014984462732010536\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6429112964366944,\n \"acc_stderr\": 
0.013197931775445206\n }\n}\n```", "repo_url": "https://huggingface.co/jeonsworld/CarbonVillain-en-10.7B-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|arc:challenge|25_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|gsm8k|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hellaswag|10_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-15-36.884484.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-15-36.884484.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-15-36.884484.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T20-15-36.884484.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-15-36.884484.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T20_15_36.884484", "path": ["**/details_harness|winogrande|5_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T20-15-36.884484.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_29T20_15_36.884484", "path": ["results_2023-12-29T20-15-36.884484.parquet"]}, {"split": "latest", "path": ["results_2023-12-29T20-15-36.884484.parquet"]}]}]} | 2023-12-29T20:18:24+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jeonsworld/CarbonVillain-en-10.7B-v1
Dataset automatically created during the evaluation run of model jeonsworld/CarbonVillain-en-10.7B-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
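A minimal sketch, assuming the leaderboard's usual repository naming convention (organisation and model name joined by a double underscore) and the per-task configuration and split names used across these evaluation datasets:
```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's naming convention; the
# "harness_winogrande_5" configuration is one of the per-task configs.
data = load_dataset(
    "open-llm-leaderboard/details_jeonsworld__CarbonVillain-en-10.7B-v1",
    "harness_winogrande_5",
    split="train",
)
```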
## Latest results
These are the latest results from run 2023-12-29T20:15:36.884484 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of jeonsworld/CarbonVillain-en-10.7B-v1\n\n\n\nDataset automatically created during the evaluation run of model jeonsworld/CarbonVillain-en-10.7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T20:15:36.884484(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jeonsworld/CarbonVillain-en-10.7B-v1\n\n\n\nDataset automatically created during the evaluation run of model jeonsworld/CarbonVillain-en-10.7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T20:15:36.884484(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jeonsworld/CarbonVillain-en-10.7B-v1\n\n\n\nDataset automatically created during the evaluation run of model jeonsworld/CarbonVillain-en-10.7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T20:15:36.884484(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
19daee89706dce07c4714284cbc704e716ce82ec |
# Dataset Card for Evaluation run of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TinyLlama__TinyLlama-1.1B-intermediate-step-1431k-3T",
"harness_winogrande_5",
split="train")
```
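
The aggregated metrics mentioned above live in the "results" configuration; the sketch below assumes the same `datasets` API and uses the config and split names listed in this card.

```python
from datasets import load_dataset

# Aggregated results for this model; the "latest" split always points to
# the most recent evaluation run (here 2023-12-29T20:19:42.566398).
results = load_dataset(
    "open-llm-leaderboard/details_TinyLlama__TinyLlama-1.1B-intermediate-step-1431k-3T",
    "results",
    split="latest",
)
print(results)
```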
## Latest results
These are the [latest results from run 2023-12-29T20:19:42.566398](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyLlama__TinyLlama-1.1B-intermediate-step-1431k-3T/blob/main/results_2023-12-29T20-19-42.566398.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.265691244274486,
"acc_stderr": 0.031066770980303738,
"acc_norm": 0.26755149869038447,
"acc_norm_stderr": 0.031835502327294145,
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757447,
"mc2": 0.3732177557725045,
"mc2_stderr": 0.013798981933202878
},
"harness|arc:challenge|25": {
"acc": 0.3046075085324232,
"acc_stderr": 0.01344952210993249,
"acc_norm": 0.3387372013651877,
"acc_norm_stderr": 0.01383056892797433
},
"harness|hellaswag|10": {
"acc": 0.4493128858793069,
"acc_stderr": 0.00496407587012034,
"acc_norm": 0.6030671181039634,
"acc_norm_stderr": 0.004882619484166595
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816503,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.11851851851851852,
"acc_stderr": 0.027922050250639055,
"acc_norm": 0.11851851851851852,
"acc_norm_stderr": 0.027922050250639055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.15789473684210525,
"acc_stderr": 0.029674167520101456,
"acc_norm": 0.15789473684210525,
"acc_norm_stderr": 0.029674167520101456
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24528301886792453,
"acc_stderr": 0.02648035717989569,
"acc_norm": 0.24528301886792453,
"acc_norm_stderr": 0.02648035717989569
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.02989614568209546,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.02989614568209546
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.03646758875075566,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.03646758875075566
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776578,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776578
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22258064516129034,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.22258064516129034,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114485,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114485
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.0298575156733864,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.0298575156733864
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23834196891191708,
"acc_stderr": 0.03074890536390988,
"acc_norm": 0.23834196891191708,
"acc_norm_stderr": 0.03074890536390988
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.27692307692307694,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.27692307692307694,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.028510251512341933,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.028510251512341933
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24220183486238533,
"acc_stderr": 0.018368176306598618,
"acc_norm": 0.24220183486238533,
"acc_norm_stderr": 0.018368176306598618
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02977177522814565,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02977177522814565
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22784810126582278,
"acc_stderr": 0.027303484599069422,
"acc_norm": 0.22784810126582278,
"acc_norm_stderr": 0.027303484599069422
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.35874439461883406,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.35874439461883406,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591204,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591204
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26053639846743293,
"acc_stderr": 0.015696008563807096,
"acc_norm": 0.26053639846743293,
"acc_norm_stderr": 0.015696008563807096
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22254335260115607,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.22254335260115607,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2346368715083799,
"acc_stderr": 0.014173044098303654,
"acc_norm": 0.2346368715083799,
"acc_norm_stderr": 0.014173044098303654
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.024954184324879912,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.024954184324879912
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22340425531914893,
"acc_stderr": 0.02484792135806396,
"acc_norm": 0.22340425531914893,
"acc_norm_stderr": 0.02484792135806396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2242503259452412,
"acc_stderr": 0.010652615824906172,
"acc_norm": 0.2242503259452412,
"acc_norm_stderr": 0.010652615824906172
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.029289413409403196,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.029289413409403196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528044,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528044
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.14285714285714285,
"acc_stderr": 0.022401787435256386,
"acc_norm": 0.14285714285714285,
"acc_norm_stderr": 0.022401787435256386
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3072289156626506,
"acc_stderr": 0.035915667978246635,
"acc_norm": 0.3072289156626506,
"acc_norm_stderr": 0.035915667978246635
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.03446296217088426,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.03446296217088426
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757447,
"mc2": 0.3732177557725045,
"mc2_stderr": 0.013798981933202878
},
"harness|winogrande|5": {
"acc": 0.5951065509076559,
"acc_stderr": 0.013795927003124934
},
"harness|gsm8k|5": {
"acc": 0.014404852160727824,
"acc_stderr": 0.0032820559171369596
}
}
```
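
Since the summary above is a plain nested dictionary, individual benchmarks can be pulled out by their task prefix once it has been parsed. The snippet below is only an illustration and reproduces just three of the entries shown above.

```python
# Minimal sketch: "scores" stands in for the full dictionary printed above
# (only a few entries are copied here; the real one has an entry per task).
scores = {
    "harness|arc:challenge|25": {"acc_norm": 0.3387372013651877},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.23},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.11851851851851852},
}

# Keep only the MMLU (hendrycksTest) subtasks and average their acc_norm.
mmlu = {
    task: metrics["acc_norm"]
    for task, metrics in scores.items()
    if task.startswith("harness|hendrycksTest-")
}
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {sum(mmlu.values()) / len(mmlu):.4f}")
```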
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_TinyLlama__TinyLlama-1.1B-intermediate-step-1431k-3T | [
"region:us"
] | 2023-12-29T20:21:31+00:00 | {"pretty_name": "Evaluation run of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T", "dataset_summary": "Dataset automatically created during the evaluation run of model [TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TinyLlama__TinyLlama-1.1B-intermediate-step-1431k-3T\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T20:19:42.566398](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyLlama__TinyLlama-1.1B-intermediate-step-1431k-3T/blob/main/results_2023-12-29T20-19-42.566398.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.265691244274486,\n \"acc_stderr\": 0.031066770980303738,\n \"acc_norm\": 0.26755149869038447,\n \"acc_norm_stderr\": 0.031835502327294145,\n \"mc1\": 0.21909424724602203,\n \"mc1_stderr\": 0.014480038578757447,\n \"mc2\": 0.3732177557725045,\n \"mc2_stderr\": 0.013798981933202878\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3046075085324232,\n \"acc_stderr\": 0.01344952210993249,\n \"acc_norm\": 0.3387372013651877,\n \"acc_norm_stderr\": 0.01383056892797433\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4493128858793069,\n \"acc_stderr\": 0.00496407587012034,\n \"acc_norm\": 0.6030671181039634,\n \"acc_norm_stderr\": 0.004882619484166595\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816503,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816503\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.11851851851851852,\n \"acc_stderr\": 0.027922050250639055,\n \"acc_norm\": 0.11851851851851852,\n \"acc_norm_stderr\": 0.027922050250639055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.15789473684210525,\n \"acc_stderr\": 0.029674167520101456,\n \"acc_norm\": 0.15789473684210525,\n \"acc_norm_stderr\": 0.029674167520101456\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.24528301886792453,\n \"acc_stderr\": 0.02648035717989569,\n \"acc_norm\": 0.24528301886792453,\n \"acc_norm_stderr\": 0.02648035717989569\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n 
\"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.02989614568209546,\n \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.02989614568209546\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.03646758875075566,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.03646758875075566\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776578,\n \"acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776578\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22258064516129034,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.22258064516129034,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114485,\n \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114485\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.0298575156733864,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.0298575156733864\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.23834196891191708,\n \"acc_stderr\": 
0.03074890536390988,\n \"acc_norm\": 0.23834196891191708,\n \"acc_norm_stderr\": 0.03074890536390988\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.27692307692307694,\n \"acc_stderr\": 0.022688042352424994,\n \"acc_norm\": 0.27692307692307694,\n \"acc_norm_stderr\": 0.022688042352424994\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341933,\n \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341933\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24220183486238533,\n \"acc_stderr\": 0.018368176306598618,\n \"acc_norm\": 0.24220183486238533,\n \"acc_norm_stderr\": 0.018368176306598618\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.033981108902946366,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.033981108902946366\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02977177522814565,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02977177522814565\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.22784810126582278,\n \"acc_stderr\": 0.027303484599069422,\n \"acc_norm\": 0.22784810126582278,\n \"acc_norm_stderr\": 0.027303484599069422\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.35874439461883406,\n \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.35874439461883406,\n \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591204,\n \"acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591204\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n 
\"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26053639846743293,\n \"acc_stderr\": 0.015696008563807096,\n \"acc_norm\": 0.26053639846743293,\n \"acc_norm_stderr\": 0.015696008563807096\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.22254335260115607,\n \"acc_stderr\": 0.02239421566194282,\n \"acc_norm\": 0.22254335260115607,\n \"acc_norm_stderr\": 0.02239421566194282\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2346368715083799,\n \"acc_stderr\": 0.014173044098303654,\n \"acc_norm\": 0.2346368715083799,\n \"acc_norm_stderr\": 0.014173044098303654\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.024954184324879912,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.024954184324879912\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.2797427652733119,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.02447722285613511,\n \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.02447722285613511\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.22340425531914893,\n \"acc_stderr\": 0.02484792135806396,\n \"acc_norm\": 0.22340425531914893,\n \"acc_norm_stderr\": 0.02484792135806396\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2242503259452412,\n \"acc_stderr\": 0.010652615824906172,\n \"acc_norm\": 0.2242503259452412,\n \"acc_norm_stderr\": 0.010652615824906172\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.36764705882352944,\n \"acc_stderr\": 0.029289413409403196,\n \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.029289413409403196\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528044,\n \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528044\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.14285714285714285,\n \"acc_stderr\": 0.022401787435256386,\n \"acc_norm\": 0.14285714285714285,\n \"acc_norm_stderr\": 0.022401787435256386\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3072289156626506,\n \"acc_stderr\": 0.035915667978246635,\n \"acc_norm\": 0.3072289156626506,\n \"acc_norm_stderr\": 0.035915667978246635\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.03446296217088426,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.03446296217088426\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21909424724602203,\n \"mc1_stderr\": 0.014480038578757447,\n \"mc2\": 0.3732177557725045,\n \"mc2_stderr\": 0.013798981933202878\n },\n \"harness|winogrande|5\": {\n \"acc\": 
0.5951065509076559,\n \"acc_stderr\": 0.013795927003124934\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \"acc_stderr\": 0.0032820559171369596\n }\n}\n```", "repo_url": "https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|arc:challenge|25_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|gsm8k|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hellaswag|10_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-19-42.566398.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-19-42.566398.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-19-42.566398.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T20-19-42.566398.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-19-42.566398.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": 
["**/details_harness|truthfulqa:mc|0_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["**/details_harness|winogrande|5_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T20-19-42.566398.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T20_19_42.566398", "path": ["results_2023-12-29T20-19-42.566398.parquet"]}, {"split": "latest", "path": ["results_2023-12-29T20-19-42.566398.parquet"]}]}]} | 2023-12-29T20:22:00+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
Dataset automatically created during the evaluation run of model TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
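A minimal sketch, assuming the details repository follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by these leaderboard datasets (so, presumably, `open-llm-leaderboard/details_TinyLlama__TinyLlama-1.1B-intermediate-step-1431k-3T`):

```python
from datasets import load_dataset

# The repo id below is an assumption based on the details_<org>__<model> convention.
data = load_dataset(
    "open-llm-leaderboard/details_TinyLlama__TinyLlama-1.1B-intermediate-step-1431k-3T",
    "harness_winogrande_5",  # any of the 63 task configurations works here
    split="train",           # "train" points to the latest results
)
```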
## Latest results
These are the latest results from run 2023-12-29T20:19:42.566398 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T\n\n\n\nDataset automatically created during the evaluation run of model TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T20:19:42.566398(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T\n\n\n\nDataset automatically created during the evaluation run of model TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T20:19:42.566398(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
207,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T\n\n\n\nDataset automatically created during the evaluation run of model TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T20:19:42.566398(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]"
] |
a8f06a082e5e9ff813dca5f47ce6d26ae9eb7460 | # Dataset Card for "alpaca-code-flat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | sordonia/alpaca-code-flat | [
"region:us"
] | 2023-12-29T20:27:00+00:00 | {"dataset_info": {"features": [{"name": "task_source", "dtype": "string"}, {"name": "task_name", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "target", "dtype": "string"}, {"name": "split", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 45357255.85486926, "num_examples": 77642}], "download_size": 19492313, "dataset_size": 45357255.85486926}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-29T21:05:26+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "alpaca-code-flat"
More Information needed | [
"# Dataset Card for \"alpaca-code-flat\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"alpaca-code-flat\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"alpaca-code-flat\"\n\nMore Information needed"
] |
ad609b9e75372fc82f770d36909a66913f2e6f5f |
# Dataset Card for Evaluation run of sethuiyer/SynthIQ-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sethuiyer/SynthIQ-7b](https://huggingface.co/sethuiyer/SynthIQ-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sethuiyer__SynthIQ-7b",
"harness_winogrande_5",
split="train")
```
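The task configurations and the timestamped run splits can also be inspected programmatically. A minimal sketch, using the config name `harness_gsm8k_5` and the run split `2023_12_29T20_56_49.534074` listed in this card's metadata:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_sethuiyer__SynthIQ-7b"

# One config per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# A specific run can be read through its timestamped split instead of the
# "latest"/"train" alias.
gsm8k_run = load_dataset(repo, "harness_gsm8k_5", split="2023_12_29T20_56_49.534074")
print(gsm8k_run)
```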
## Latest results
These are the [latest results from run 2023-12-29T20:56:49.534074](https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__SynthIQ-7b/blob/main/results_2023-12-29T20-56-49.534074.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6496837874821432,
"acc_stderr": 0.031931649395756406,
"acc_norm": 0.6512485789591291,
"acc_norm_stderr": 0.03256964811222345,
"mc1": 0.39657282741738065,
"mc1_stderr": 0.017124930942023515,
"mc2": 0.570003601485072,
"mc2_stderr": 0.015572869395398968
},
"harness|arc:challenge|25": {
"acc": 0.6254266211604096,
"acc_stderr": 0.014144193471893449,
"acc_norm": 0.658703071672355,
"acc_norm_stderr": 0.013855831287497724
},
"harness|hellaswag|10": {
"acc": 0.6709818761202948,
"acc_stderr": 0.004688963175758135,
"acc_norm": 0.858195578570006,
"acc_norm_stderr": 0.003481364840770978
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113728,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113728
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971128,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971128
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634325,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634325
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508287,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.01653682964899711,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.01653682964899711
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045708,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045708
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495148,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495148
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.02448448716291397,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.02448448716291397
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39657282741738065,
"mc1_stderr": 0.017124930942023515,
"mc2": 0.570003601485072,
"mc2_stderr": 0.015572869395398968
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.01150895769072275
},
"harness|gsm8k|5": {
"acc": 0.640636846095527,
"acc_stderr": 0.013216456309851523
}
}
```
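Because the report above is a plain nested dictionary, aggregate numbers such as the mean MMLU accuracy can be recomputed directly from it. A minimal sketch, using only a small excerpt of the values shown above (the full dictionary contains all of the `hendrycksTest` subtasks):

```python
# Excerpt of the results dictionary shown above (values copied verbatim).
results = {
    "all": {"acc": 0.6496837874821432, "acc_norm": 0.6512485789591291},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6973684210526315},
}

# Mean accuracy over whichever MMLU (hendrycksTest) subtasks are present.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")

# The pre-aggregated averages are also available under the "all" key.
print(results["all"]["acc"], results["all"]["acc_norm"])
```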
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_sethuiyer__SynthIQ-7b | [
"region:us"
] | 2023-12-29T20:59:06+00:00 | {"pretty_name": "Evaluation run of sethuiyer/SynthIQ-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [sethuiyer/SynthIQ-7b](https://huggingface.co/sethuiyer/SynthIQ-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sethuiyer__SynthIQ-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T20:56:49.534074](https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__SynthIQ-7b/blob/main/results_2023-12-29T20-56-49.534074.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6496837874821432,\n \"acc_stderr\": 0.031931649395756406,\n \"acc_norm\": 0.6512485789591291,\n \"acc_norm_stderr\": 0.03256964811222345,\n \"mc1\": 0.39657282741738065,\n \"mc1_stderr\": 0.017124930942023515,\n \"mc2\": 0.570003601485072,\n \"mc2_stderr\": 0.015572869395398968\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.014144193471893449,\n \"acc_norm\": 0.658703071672355,\n \"acc_norm_stderr\": 0.013855831287497724\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6709818761202948,\n \"acc_stderr\": 0.004688963175758135,\n \"acc_norm\": 0.858195578570006,\n \"acc_norm_stderr\": 0.003481364840770978\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n 
\"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113728,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113728\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n 
\"acc_stderr\": 0.023854795680971128,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971128\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634325,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634325\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 
0.8199233716475096,\n \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508287,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n \"acc_stderr\": 0.01653682964899711,\n \"acc_norm\": 0.42569832402234636,\n \"acc_norm_stderr\": 0.01653682964899711\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.012739711554045708,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.012739711554045708\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495148,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495148\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.02448448716291397,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.02448448716291397\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39657282741738065,\n \"mc1_stderr\": 0.017124930942023515,\n \"mc2\": 0.570003601485072,\n \"mc2_stderr\": 0.015572869395398968\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.01150895769072275\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.640636846095527,\n \"acc_stderr\": 0.013216456309851523\n }\n}\n```", "repo_url": "https://huggingface.co/sethuiyer/SynthIQ-7b", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|arc:challenge|25_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|gsm8k|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hellaswag|10_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-56-49.534074.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-56-49.534074.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-56-49.534074.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T20-56-49.534074.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-56-49.534074.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T20-56-49.534074.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["**/details_harness|winogrande|5_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T20-56-49.534074.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T20_56_49.534074", "path": ["results_2023-12-29T20-56-49.534074.parquet"]}, {"split": "latest", "path": 
["results_2023-12-29T20-56-49.534074.parquet"]}]}]} | 2023-12-29T20:59:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of sethuiyer/SynthIQ-7b
Dataset automatically created during the evaluation run of model sethuiyer/SynthIQ-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
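A minimal sketch, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming (here `open-llm-leaderboard/details_sethuiyer__SynthIQ-7b`):

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard naming convention; any task configuration works,
# and the "latest" behaviour described above applies to every configuration.
data = load_dataset("open-llm-leaderboard/details_sethuiyer__SynthIQ-7b",
	"harness_winogrande_5",
	split="train")
```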
## Latest results
These are the latest results from run 2023-12-29T20:56:49.534074 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of sethuiyer/SynthIQ-7b\n\n\n\nDataset automatically created during the evaluation run of model sethuiyer/SynthIQ-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T20:56:49.534074(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of sethuiyer/SynthIQ-7b\n\n\n\nDataset automatically created during the evaluation run of model sethuiyer/SynthIQ-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T20:56:49.534074(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of sethuiyer/SynthIQ-7b\n\n\n\nDataset automatically created during the evaluation run of model sethuiyer/SynthIQ-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T20:56:49.534074(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
f1c45258a4de03172fc70d454985420302a97690 |
# Dataset Card for Evaluation run of Mihaiii/Metis-0.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Mihaiii/Metis-0.5](https://huggingface.co/Mihaiii/Metis-0.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mihaiii__Metis-0.5",
"harness_winogrande_5",
split="train")
```
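If you only need the aggregated metrics, the "results" configuration mentioned above can be loaded the same way; a small sketch (the "latest" split points to the most recent run, while the timestamped split name selects a specific run):

```python
from datasets import load_dataset

# Aggregated metrics for the whole run; swap "latest" for the timestamped split to pin a run.
results = load_dataset("open-llm-leaderboard/details_Mihaiii__Metis-0.5",
	"results",
	split="latest")
```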
## Latest results
These are the [latest results from run 2023-12-29T21:03:34.268283](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Metis-0.5/blob/main/results_2023-12-29T21-03-34.268283.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6205495686116762,
"acc_stderr": 0.03278601697551399,
"acc_norm": 0.6253153124790392,
"acc_norm_stderr": 0.033438294991220995,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.4932590097591299,
"mc2_stderr": 0.015440588307546098
},
"harness|arc:challenge|25": {
"acc": 0.5887372013651877,
"acc_stderr": 0.014379441068522082,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759093
},
"harness|hellaswag|10": {
"acc": 0.6546504680342561,
"acc_stderr": 0.004745103543901293,
"acc_norm": 0.8376817367058355,
"acc_norm_stderr": 0.003679889125399814
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894444,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.02499305339776483,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.02499305339776483
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.01703071933915434,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.01703071933915434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229153,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229153
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531772,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531772
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.02537213967172293,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.02537213967172293
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407003,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407003
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.02475241196091721,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.02475241196091721
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41787709497206704,
"acc_stderr": 0.01649540063582008,
"acc_norm": 0.41787709497206704,
"acc_norm_stderr": 0.01649540063582008
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824087,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824087
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.438722294654498,
"acc_stderr": 0.012673969883493272,
"acc_norm": 0.438722294654498,
"acc_norm_stderr": 0.012673969883493272
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.02916312857067073,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.02916312857067073
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.019450768432505514,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.019450768432505514
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249765,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249765
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338733,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338733
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.4932590097591299,
"mc2_stderr": 0.015440588307546098
},
"harness|winogrande|5": {
"acc": 0.7513812154696132,
"acc_stderr": 0.012147314713403107
},
"harness|gsm8k|5": {
"acc": 0.4291129643669447,
"acc_stderr": 0.013633369425647244
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Mihaiii__Metis-0.5 | [
"region:us"
] | 2023-12-29T21:05:50+00:00 | {"pretty_name": "Evaluation run of Mihaiii/Metis-0.5", "dataset_summary": "Dataset automatically created during the evaluation run of model [Mihaiii/Metis-0.5](https://huggingface.co/Mihaiii/Metis-0.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mihaiii__Metis-0.5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T21:03:34.268283](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Metis-0.5/blob/main/results_2023-12-29T21-03-34.268283.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6205495686116762,\n \"acc_stderr\": 0.03278601697551399,\n \"acc_norm\": 0.6253153124790392,\n \"acc_norm_stderr\": 0.033438294991220995,\n \"mc1\": 0.3402692778457772,\n \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.4932590097591299,\n \"mc2_stderr\": 0.015440588307546098\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522082,\n \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759093\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6546504680342561,\n \"acc_stderr\": 0.004745103543901293,\n \"acc_norm\": 0.8376817367058355,\n \"acc_norm_stderr\": 0.003679889125399814\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n 
\"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894444,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894444\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n \"acc_stderr\": 0.02499305339776483,\n \"acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.02499305339776483\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8036697247706422,\n \"acc_stderr\": 0.01703071933915434,\n \"acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.01703071933915434\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229153,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229153\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531772,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531772\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n \"acc_stderr\": 0.02537213967172293,\n \"acc_norm\": 0.8162393162393162,\n \"acc_norm_stderr\": 0.02537213967172293\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n \"acc_stderr\": 0.014000791294407003,\n 
\"acc_norm\": 0.8109833971902938,\n \"acc_norm_stderr\": 0.014000791294407003\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.02475241196091721,\n \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.02475241196091721\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n \"acc_stderr\": 0.01649540063582008,\n \"acc_norm\": 0.41787709497206704,\n \"acc_norm_stderr\": 0.01649540063582008\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824087,\n \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824087\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.438722294654498,\n \"acc_stderr\": 0.012673969883493272,\n \"acc_norm\": 0.438722294654498,\n \"acc_norm_stderr\": 0.012673969883493272\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.019450768432505514,\n \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.019450768432505514\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249765,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249765\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338733,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338733\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.4932590097591299,\n \"mc2_stderr\": 0.015440588307546098\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7513812154696132,\n \"acc_stderr\": 0.012147314713403107\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4291129643669447,\n \"acc_stderr\": 0.013633369425647244\n }\n}\n```", "repo_url": "https://huggingface.co/Mihaiii/Metis-0.5", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|arc:challenge|25_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|gsm8k|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hellaswag|10_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-03-34.268283.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-03-34.268283.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-03-34.268283.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T21-03-34.268283.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-03-34.268283.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-03-34.268283.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["**/details_harness|winogrande|5_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T21-03-34.268283.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T21_03_34.268283", "path": ["results_2023-12-29T21-03-34.268283.parquet"]}, {"split": "latest", "path": 
["results_2023-12-29T21-03-34.268283.parquet"]}]}]} | 2023-12-29T21:06:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Mihaiii/Metis-0.5
Dataset automatically created during the evaluation run of model Mihaiii/Metis-0.5 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
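A minimal sketch is shown below; the details repository name follows the leaderboard's usual `details_<org>__<model>` naming pattern for this model (an assumption), and `harness_winogrande_5` is just one of the available configs:

```python
from datasets import load_dataset

# Assumed repository name, derived from the model id Mihaiii/Metis-0.5
data = load_dataset("open-llm-leaderboard/details_Mihaiii__Metis-0.5",
	"harness_winogrande_5",
	split="train")
```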
## Latest results
These are the latest results from run 2023-12-29T21:03:34.268283 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Mihaiii/Metis-0.5\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Metis-0.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T21:03:34.268283(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Mihaiii/Metis-0.5\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Metis-0.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T21:03:34.268283(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
175,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Mihaiii/Metis-0.5\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Metis-0.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T21:03:34.268283(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
3180b89b95900a3ffcfa6a6dd955dfdd8ae349b6 |
# Dataset Card for Evaluation run of aqweteddy/mistral_tv-neural-marconroni
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [aqweteddy/mistral_tv-neural-marconroni](https://huggingface.co/aqweteddy/mistral_tv-neural-marconroni) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aqweteddy__mistral_tv-neural-marconroni",
"harness_winogrande_5",
split="train")
```
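The aggregated scores live in the separate "results" configuration mentioned above; a small sketch of loading its most recent snapshot (the "latest" split name follows the config listing in this card's metadata):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics for the run; "latest" points to the
# most recent snapshot (a timestamped split also exists for each run).
results = load_dataset("open-llm-leaderboard/details_aqweteddy__mistral_tv-neural-marconroni",
	"results",
	split="latest")
```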
## Latest results
These are the [latest results from run 2023-12-29T21:06:58.547736](https://huggingface.co/datasets/open-llm-leaderboard/details_aqweteddy__mistral_tv-neural-marconroni/blob/main/results_2023-12-29T21-06-58.547736.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.653997201380612,
"acc_stderr": 0.031948482529961096,
"acc_norm": 0.6549938668135185,
"acc_norm_stderr": 0.032596331692297566,
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6002963889764419,
"mc2_stderr": 0.015327998641933535
},
"harness|arc:challenge|25": {
"acc": 0.6604095563139932,
"acc_stderr": 0.013839039762820167,
"acc_norm": 0.6919795221843004,
"acc_norm_stderr": 0.013491429517292038
},
"harness|hellaswag|10": {
"acc": 0.6724756024696276,
"acc_stderr": 0.004683511716552242,
"acc_norm": 0.8625771758613822,
"acc_norm_stderr": 0.003435895386692258
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6871794871794872,
"acc_stderr": 0.02350757902064536,
"acc_norm": 0.6871794871794872,
"acc_norm_stderr": 0.02350757902064536
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.02983796238829194,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.02983796238829194
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374308,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553353,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553353
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265023,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265023
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4100558659217877,
"acc_stderr": 0.016449708209026078,
"acc_norm": 0.4100558659217877,
"acc_norm_stderr": 0.016449708209026078
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292452,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.012740853872949834,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.012740853872949834
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.01877168389352818,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.01877168389352818
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6002963889764419,
"mc2_stderr": 0.015327998641933535
},
"harness|winogrande|5": {
"acc": 0.8089976322020521,
"acc_stderr": 0.011047808761510425
},
"harness|gsm8k|5": {
"acc": 0.6618650492797574,
"acc_stderr": 0.013030829145172217
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_aqweteddy__mistral_tv-neural-marconroni | [
"region:us"
] | 2023-12-29T21:09:16+00:00 | {"pretty_name": "Evaluation run of aqweteddy/mistral_tv-neural-marconroni", "dataset_summary": "Dataset automatically created during the evaluation run of model [aqweteddy/mistral_tv-neural-marconroni](https://huggingface.co/aqweteddy/mistral_tv-neural-marconroni) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aqweteddy__mistral_tv-neural-marconroni\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T21:06:58.547736](https://huggingface.co/datasets/open-llm-leaderboard/details_aqweteddy__mistral_tv-neural-marconroni/blob/main/results_2023-12-29T21-06-58.547736.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.653997201380612,\n \"acc_stderr\": 0.031948482529961096,\n \"acc_norm\": 0.6549938668135185,\n \"acc_norm_stderr\": 0.032596331692297566,\n \"mc1\": 0.4418604651162791,\n \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.6002963889764419,\n \"mc2_stderr\": 0.015327998641933535\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6604095563139932,\n \"acc_stderr\": 0.013839039762820167,\n \"acc_norm\": 0.6919795221843004,\n \"acc_norm_stderr\": 0.013491429517292038\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6724756024696276,\n \"acc_stderr\": 0.004683511716552242,\n \"acc_norm\": 0.8625771758613822,\n \"acc_norm_stderr\": 0.003435895386692258\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n 
\"acc_norm_stderr\": 0.021500249576033456\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6871794871794872,\n \"acc_stderr\": 0.02350757902064536,\n \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.02350757902064536\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829194,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829194\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553353,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553353\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265023,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265023\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4100558659217877,\n \"acc_stderr\": 0.016449708209026078,\n \"acc_norm\": 0.4100558659217877,\n \"acc_norm_stderr\": 0.016449708209026078\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n \"acc_stderr\": 0.012740853872949834,\n \"acc_norm\": 0.4661016949152542,\n \"acc_norm_stderr\": 0.012740853872949834\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.01877168389352818,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.01877168389352818\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4418604651162791,\n \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.6002963889764419,\n \"mc2_stderr\": 0.015327998641933535\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510425\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6618650492797574,\n 
\"acc_stderr\": 0.013030829145172217\n }\n}\n```", "repo_url": "https://huggingface.co/aqweteddy/mistral_tv-neural-marconroni", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|arc:challenge|25_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|gsm8k|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hellaswag|10_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-06-58.547736.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-06-58.547736.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-06-58.547736.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T21-06-58.547736.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-06-58.547736.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T21_06_58.547736", "path": ["**/details_harness|winogrande|5_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T21-06-58.547736.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_29T21_06_58.547736", "path": ["results_2023-12-29T21-06-58.547736.parquet"]}, {"split": "latest", "path": ["results_2023-12-29T21-06-58.547736.parquet"]}]}]} | 2023-12-29T21:09:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of aqweteddy/mistral_tv-neural-marconroni
Dataset automatically created during the evaluation run of model aqweteddy/mistral_tv-neural-marconroni on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
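A minimal sketch, assuming the standard Open LLM Leaderboard details layout; `harness_winogrande_5` is just one of the 63 available configurations:

```python
from datasets import load_dataset

# Load one task-specific configuration; the "train" split points at the latest run.
data = load_dataset("open-llm-leaderboard/details_aqweteddy__mistral_tv-neural-marconroni",
                    "harness_winogrande_5",
                    split="train")
print(data)
```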
## Latest results
These are the latest results from run 2023-12-29T21:06:58.547736 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of aqweteddy/mistral_tv-neural-marconroni\n\n\n\nDataset automatically created during the evaluation run of model aqweteddy/mistral_tv-neural-marconroni on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T21:06:58.547736(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of aqweteddy/mistral_tv-neural-marconroni\n\n\n\nDataset automatically created during the evaluation run of model aqweteddy/mistral_tv-neural-marconroni on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T21:06:58.547736(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of aqweteddy/mistral_tv-neural-marconroni\n\n\n\nDataset automatically created during the evaluation run of model aqweteddy/mistral_tv-neural-marconroni on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T21:06:58.547736(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
27583939d2f64f7dec5fb1c753a07baa770586f1 |
# Dataset Card for Evaluation run of Azazelle/Silicon-Medley
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Azazelle/Silicon-Medley](https://huggingface.co/Azazelle/Silicon-Medley) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azazelle__Silicon-Medley",
"harness_winogrande_5",
split="train")
```
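For instance, to pull only the aggregated metrics rather than the per-sample details, you can target the "results" configuration and its "latest" split. This is a minimal sketch following the configuration and split naming conventions described above; the exact schema of the returned rows depends on the harness version used for the run:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model
results = load_dataset("open-llm-leaderboard/details_Azazelle__Silicon-Medley",
	"results",
	split="latest")

print(results[0])  # one row per run, containing the aggregated scores
```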
## Latest results
These are the [latest results from run 2023-12-29T21:15:44.527913](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__Silicon-Medley/blob/main/results_2023-12-29T21-15-44.527913.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6470281740682771,
"acc_stderr": 0.03207701729050784,
"acc_norm": 0.6492542140625898,
"acc_norm_stderr": 0.03271560144589969,
"mc1": 0.4394124847001224,
"mc1_stderr": 0.01737452048251371,
"mc2": 0.6134363868702684,
"mc2_stderr": 0.015490607347297604
},
"harness|arc:challenge|25": {
"acc": 0.64419795221843,
"acc_stderr": 0.013990571137918762,
"acc_norm": 0.6723549488054608,
"acc_norm_stderr": 0.013715847940719337
},
"harness|hellaswag|10": {
"acc": 0.6767576180043816,
"acc_stderr": 0.004667585072717506,
"acc_norm": 0.8620792670782712,
"acc_norm_stderr": 0.0034411206110598344
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.02845015479411864,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.02845015479411864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718871,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718871
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970572,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970572
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.029597329730978082,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.029597329730978082
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.03409386946992699,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.03409386946992699
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940876,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940876
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459754,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464073,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464073
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4223463687150838,
"acc_stderr": 0.016519594275297117,
"acc_norm": 0.4223463687150838,
"acc_norm_stderr": 0.016519594275297117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897226,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897226
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.027365861131513812,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.027365861131513812
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223974,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223974
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4394124847001224,
"mc1_stderr": 0.01737452048251371,
"mc2": 0.6134363868702684,
"mc2_stderr": 0.015490607347297604
},
"harness|winogrande|5": {
"acc": 0.7924230465666929,
"acc_stderr": 0.01139859341938678
},
"harness|gsm8k|5": {
"acc": 0.5837755875663382,
"acc_stderr": 0.013577788334652658
}
}
```
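If you want to work with these numbers programmatically, the snippet below is a minimal sketch: it assumes you have saved the JSON block above to a local file named `results.json` (a name chosen here purely for illustration) and averages the normalized accuracy over the MMLU (`hendrycksTest`) subtasks using the key layout shown above:

```python
import json

# Load the aggregated metrics shown above (assumes they were saved to results.json)
with open("results.json") as f:
    metrics = json.load(f)

# Average normalized accuracy across the MMLU (hendrycksTest) subtasks
mmlu_scores = [v["acc_norm"] for k, v in metrics.items()
               if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu_scores)} MMLU subtasks, "
      f"mean acc_norm = {sum(mmlu_scores) / len(mmlu_scores):.4f}")
```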
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Azazelle__Silicon-Medley | [
"region:us"
] | 2023-12-29T21:18:02+00:00 | {"pretty_name": "Evaluation run of Azazelle/Silicon-Medley", "dataset_summary": "Dataset automatically created during the evaluation run of model [Azazelle/Silicon-Medley](https://huggingface.co/Azazelle/Silicon-Medley) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azazelle__Silicon-Medley\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T21:15:44.527913](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__Silicon-Medley/blob/main/results_2023-12-29T21-15-44.527913.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6470281740682771,\n \"acc_stderr\": 0.03207701729050784,\n \"acc_norm\": 0.6492542140625898,\n \"acc_norm_stderr\": 0.03271560144589969,\n \"mc1\": 0.4394124847001224,\n \"mc1_stderr\": 0.01737452048251371,\n \"mc2\": 0.6134363868702684,\n \"mc2_stderr\": 0.015490607347297604\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.64419795221843,\n \"acc_stderr\": 0.013990571137918762,\n \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719337\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6767576180043816,\n \"acc_stderr\": 0.004667585072717506,\n \"acc_norm\": 0.8620792670782712,\n \"acc_norm_stderr\": 0.0034411206110598344\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n 
\"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718871,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718871\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970572,\n 
\"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970572\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978082,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978082\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.03409386946992699,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.03409386946992699\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.038498560987940876,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940876\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n \"acc_stderr\": 0.013778693778464073,\n \"acc_norm\": 0.8186462324393359,\n \"acc_norm_stderr\": 
0.013778693778464073\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n \"acc_stderr\": 0.016519594275297117,\n \"acc_norm\": 0.4223463687150838,\n \"acc_norm_stderr\": 0.016519594275297117\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.012741974333897226,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.012741974333897226\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.027365861131513812,\n \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.027365861131513812\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223974,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223974\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4394124847001224,\n \"mc1_stderr\": 0.01737452048251371,\n \"mc2\": 0.6134363868702684,\n \"mc2_stderr\": 0.015490607347297604\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 0.01139859341938678\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5837755875663382,\n \"acc_stderr\": 0.013577788334652658\n }\n}\n```", "repo_url": "https://huggingface.co/Azazelle/Silicon-Medley", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|arc:challenge|25_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|gsm8k|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hellaswag|10_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-15-44.527913.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-15-44.527913.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-15-44.527913.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T21-15-44.527913.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-15-44.527913.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-15-44.527913.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["**/details_harness|winogrande|5_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T21-15-44.527913.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T21_15_44.527913", "path": ["results_2023-12-29T21-15-44.527913.parquet"]}, {"split": "latest", "path": 
["results_2023-12-29T21-15-44.527913.parquet"]}]}]} | 2023-12-29T21:18:26+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Azazelle/Silicon-Medley
Dataset automatically created during the evaluation run of model Azazelle/Silicon-Medley on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
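A minimal loading sketch in the same style as the other cards in this dump; the repository name below is an assumption inferred from the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention, since this section does not state it explicitly:

```python
from datasets import load_dataset

# Repository name assumed from the leaderboard naming convention; verify it before use.
data = load_dataset("open-llm-leaderboard/details_Azazelle__Silicon-Medley",
	"harness_winogrande_5",
	split="train")
```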
## Latest results
These are the latest results from run 2023-12-29T21:15:44.527913 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Azazelle/Silicon-Medley\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/Silicon-Medley on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T21:15:44.527913(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Azazelle/Silicon-Medley\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/Silicon-Medley on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T21:15:44.527913(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Azazelle/Silicon-Medley\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/Silicon-Medley on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T21:15:44.527913(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
172d93f65cadc56d4683c37dd181fbd57f16a34e |
A list of manufacturers inside Iran and outside Iran | dadashzadeh/manufacturers | [
"language:fa",
"license:mit",
"region:us"
] | 2023-12-29T21:18:30+00:00 | {"language": ["fa"], "license": "mit"} | 2023-12-29T21:27:34+00:00 | [] | [
"fa"
] | TAGS
#language-Persian #license-mit #region-us
|
A list of manufacturers inside Iran and outside Iran | [] | [
"TAGS\n#language-Persian #license-mit #region-us \n"
] | [
16
] | [
"passage: TAGS\n#language-Persian #license-mit #region-us \n"
] |
f738dd82ad7f1f56c25fa44d17b727f244944e40 |
# Dataset Card for Evaluation run of aloobun/bun_mistral_7b_v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [aloobun/bun_mistral_7b_v2](https://huggingface.co/aloobun/bun_mistral_7b_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aloobun__bun_mistral_7b_v2",
"harness_winogrande_5",
split="train")
```
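As a small follow-up sketch, the aggregated scores described above can be pulled from the "results" configuration's "latest" split; this only assumes the standard `datasets` API and the configuration/split names stated in this card:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model.
results = load_dataset("open-llm-leaderboard/details_aloobun__bun_mistral_7b_v2",
	"results",
	split="latest")
print(results[0])
```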
## Latest results
These are the [latest results from run 2023-12-29T21:43:11.868828](https://huggingface.co/datasets/open-llm-leaderboard/details_aloobun__bun_mistral_7b_v2/blob/main/results_2023-12-29T21-43-11.868828.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6156047359789459,
"acc_stderr": 0.03249111009517131,
"acc_norm": 0.6209297452635882,
"acc_norm_stderr": 0.03315335422122162,
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.40666362595991745,
"mc2_stderr": 0.01440530497666933
},
"harness|arc:challenge|25": {
"acc": 0.5708191126279863,
"acc_stderr": 0.014464085894870655,
"acc_norm": 0.5989761092150171,
"acc_norm_stderr": 0.014322255790719869
},
"harness|hellaswag|10": {
"acc": 0.6362278430591516,
"acc_stderr": 0.00480100965769044,
"acc_norm": 0.8265285799641505,
"acc_norm_stderr": 0.0037788044746059103
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.04113914981189261,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.04113914981189261
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067884,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067884
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878937,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878937
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606647,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606647
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834832,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834832
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.024685316867257803,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.024685316867257803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23128491620111732,
"acc_stderr": 0.014102223623152573,
"acc_norm": 0.23128491620111732,
"acc_norm_stderr": 0.014102223623152573
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799215,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799215
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44784876140808344,
"acc_stderr": 0.012700582404768223,
"acc_norm": 0.44784876140808344,
"acc_norm_stderr": 0.012700582404768223
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.019353360547553704,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.019353360547553704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.40666362595991745,
"mc2_stderr": 0.01440530497666933
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.011585871710209404
},
"harness|gsm8k|5": {
"acc": 0.3525398028809704,
"acc_stderr": 0.013159909755930317
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_aloobun__bun_mistral_7b_v2 | [
"region:us"
] | 2023-12-29T21:45:29+00:00 | {"pretty_name": "Evaluation run of aloobun/bun_mistral_7b_v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [aloobun/bun_mistral_7b_v2](https://huggingface.co/aloobun/bun_mistral_7b_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aloobun__bun_mistral_7b_v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T21:43:11.868828](https://huggingface.co/datasets/open-llm-leaderboard/details_aloobun__bun_mistral_7b_v2/blob/main/results_2023-12-29T21-43-11.868828.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6156047359789459,\n \"acc_stderr\": 0.03249111009517131,\n \"acc_norm\": 0.6209297452635882,\n \"acc_norm_stderr\": 0.03315335422122162,\n \"mc1\": 0.27539779681762544,\n \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.40666362595991745,\n \"mc2_stderr\": 0.01440530497666933\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5708191126279863,\n \"acc_stderr\": 0.014464085894870655,\n \"acc_norm\": 0.5989761092150171,\n \"acc_norm_stderr\": 0.014322255790719869\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6362278430591516,\n \"acc_stderr\": 0.00480100965769044,\n \"acc_norm\": 0.8265285799641505,\n \"acc_norm_stderr\": 0.0037788044746059103\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 
0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.04113914981189261,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.04113914981189261\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067884,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067884\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878937,\n \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878937\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606647,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606647\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073332,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073332\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n 
\"acc_stderr\": 0.013664230995834832,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834832\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.024685316867257803,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.024685316867257803\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23128491620111732,\n \"acc_stderr\": 0.014102223623152573,\n \"acc_norm\": 0.23128491620111732,\n \"acc_norm_stderr\": 0.014102223623152573\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799215,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799215\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n \"acc_stderr\": 0.012700582404768223,\n \"acc_norm\": 0.44784876140808344,\n \"acc_norm_stderr\": 0.012700582404768223\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6454248366013072,\n \"acc_stderr\": 0.019353360547553704,\n \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.019353360547553704\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27539779681762544,\n \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.40666362595991745,\n \"mc2_stderr\": 0.01440530497666933\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.011585871710209404\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3525398028809704,\n \"acc_stderr\": 0.013159909755930317\n }\n}\n```", 
"repo_url": "https://huggingface.co/aloobun/bun_mistral_7b_v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|arc:challenge|25_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|gsm8k|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hellaswag|10_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-43-11.868828.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-43-11.868828.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-43-11.868828.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T21-43-11.868828.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-43-11.868828.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T21_43_11.868828", "path": ["**/details_harness|winogrande|5_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T21-43-11.868828.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_29T21_43_11.868828", "path": ["results_2023-12-29T21-43-11.868828.parquet"]}, {"split": "latest", "path": ["results_2023-12-29T21-43-11.868828.parquet"]}]}]} | 2023-12-29T21:45:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of aloobun/bun_mistral_7b_v2
Dataset automatically created during the evaluation run of model aloobun/bun_mistral_7b_v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-29T21:43:11.868828 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of aloobun/bun_mistral_7b_v2\n\n\n\nDataset automatically created during the evaluation run of model aloobun/bun_mistral_7b_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T21:43:11.868828(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of aloobun/bun_mistral_7b_v2\n\n\n\nDataset automatically created during the evaluation run of model aloobun/bun_mistral_7b_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T21:43:11.868828(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of aloobun/bun_mistral_7b_v2\n\n\n\nDataset automatically created during the evaluation run of model aloobun/bun_mistral_7b_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T21:43:11.868828(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
09d0855fc6a003430908d2df2167bf6fe08ba7a2 |
# Dataset Card for Evaluation run of seungduk/KoSOLAR-10.7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [seungduk/KoSOLAR-10.7B-v0.1](https://huggingface.co/seungduk/KoSOLAR-10.7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_seungduk__KoSOLAR-10.7B-v0.1",
"harness_winogrande_5",
split="train")
```
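
The aggregated scores described above can be pulled from the "results" configuration in the same way. The snippet below is a minimal sketch, assuming the standard `datasets` API and the "latest" split naming used throughout this repository:

```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration for this model.
# The "latest" split points to the most recent evaluation run (see above).
results = load_dataset(
    "open-llm-leaderboard/details_seungduk__KoSOLAR-10.7B-v0.1",
    "results",
    split="latest",
)

# Inspect what the aggregated results table contains.
print(results.column_names)
print(results[0])
```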
## Latest results
These are the [latest results from run 2023-12-29T21:44:43.738734](https://huggingface.co/datasets/open-llm-leaderboard/details_seungduk__KoSOLAR-10.7B-v0.1/blob/main/results_2023-12-29T21-44-43.738734.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6557296612764617,
"acc_stderr": 0.03164789932062894,
"acc_norm": 0.6581869158001639,
"acc_norm_stderr": 0.032289473377177816,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4502592960541729,
"mc2_stderr": 0.014224680831862879
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225405,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.014182119866974872
},
"harness|hellaswag|10": {
"acc": 0.6535550687114121,
"acc_stderr": 0.00474864513328157,
"acc_norm": 0.8454491137223661,
"acc_norm_stderr": 0.0036073726062951024
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554963,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.025545650426603627,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.025545650426603627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223154,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633507,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633507
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.029502861128955286,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.029502861128955286
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092434,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250454,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250454
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8270042194092827,
"acc_stderr": 0.02462156286676842,
"acc_norm": 0.8270042194092827,
"acc_norm_stderr": 0.02462156286676842
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508773,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508773
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281382,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066307,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265023,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265023
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261445,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261445
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7875816993464052,
"acc_stderr": 0.02342037547829613,
"acc_norm": 0.7875816993464052,
"acc_norm_stderr": 0.02342037547829613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7654320987654321,
"acc_stderr": 0.023576881744005716,
"acc_norm": 0.7654320987654321,
"acc_norm_stderr": 0.023576881744005716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4980443285528031,
"acc_stderr": 0.012770138422208628,
"acc_norm": 0.4980443285528031,
"acc_norm_stderr": 0.012770138422208628
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103135,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103135
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.018492596536396955,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.018492596536396955
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7755102040816326,
"acc_stderr": 0.0267114305555384,
"acc_norm": 0.7755102040816326,
"acc_norm_stderr": 0.0267114305555384
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.028782108105401705,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.028782108105401705
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4502592960541729,
"mc2_stderr": 0.014224680831862879
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222789
},
"harness|gsm8k|5": {
"acc": 0.5549658832448825,
"acc_stderr": 0.0136890115674142
}
}
```
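
The same figures can also be read programmatically. The snippet below is a minimal sketch; it assumes the results JSON has been downloaded locally under the file name shown in the link above, and that the metrics are laid out as in the excerpt (task name mapped to accuracy fields):

```python
import json

# Minimal sketch: print per-task accuracies from a downloaded results file
# (file name assumed from the link above).
with open("results_2023-12-29T21-44-43.738734.json") as f:
    data = json.load(f)

# Depending on the harness version the metrics may be nested under a "results"
# key; otherwise they sit at the top level as in the excerpt above.
metrics = data.get("results", data)

for task, scores in sorted(metrics.items()):
    acc = scores.get("acc_norm", scores.get("acc"))
    if acc is not None:
        print(f"{task:65s} {acc:.4f}")
```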
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_seungduk__KoSOLAR-10.7B-v0.1 | [
"region:us"
] | 2023-12-29T21:46:58+00:00 | {"pretty_name": "Evaluation run of seungduk/KoSOLAR-10.7B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [seungduk/KoSOLAR-10.7B-v0.1](https://huggingface.co/seungduk/KoSOLAR-10.7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_seungduk__KoSOLAR-10.7B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T21:44:43.738734](https://huggingface.co/datasets/open-llm-leaderboard/details_seungduk__KoSOLAR-10.7B-v0.1/blob/main/results_2023-12-29T21-44-43.738734.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6557296612764617,\n \"acc_stderr\": 0.03164789932062894,\n \"acc_norm\": 0.6581869158001639,\n \"acc_norm_stderr\": 0.032289473377177816,\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4502592960541729,\n \"mc2_stderr\": 0.014224680831862879\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225405,\n \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6535550687114121,\n \"acc_stderr\": 0.00474864513328157,\n \"acc_norm\": 0.8454491137223661,\n \"acc_norm_stderr\": 0.0036073726062951024\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n 
\"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554963,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047709,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047709\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603627,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603627\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223154,\n \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223154\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633507,\n \"acc_norm\": 
0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633507\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37407407407407406,\n \"acc_stderr\": 0.029502861128955286,\n \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.029502861128955286\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092434,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092434\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250454,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250454\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8270042194092827,\n \"acc_stderr\": 0.02462156286676842,\n \"acc_norm\": 0.8270042194092827,\n \"acc_norm_stderr\": 0.02462156286676842\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n \"acc_stderr\": 0.030216831011508773,\n \"acc_norm\": 0.7174887892376681,\n \"acc_norm_stderr\": 0.030216831011508773\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281382,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281382\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066307\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265023,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265023\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n \"acc_stderr\": 0.014444157808261445,\n \"acc_norm\": 0.24804469273743016,\n \"acc_norm_stderr\": 0.014444157808261445\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7875816993464052,\n \"acc_stderr\": 0.02342037547829613,\n \"acc_norm\": 0.7875816993464052,\n \"acc_norm_stderr\": 0.02342037547829613\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7654320987654321,\n \"acc_stderr\": 0.023576881744005716,\n \"acc_norm\": 0.7654320987654321,\n \"acc_norm_stderr\": 0.023576881744005716\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4980443285528031,\n \"acc_stderr\": 0.012770138422208628,\n \"acc_norm\": 0.4980443285528031,\n \"acc_norm_stderr\": 0.012770138422208628\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103135,\n \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103135\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.018492596536396955,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.018492596536396955\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7755102040816326,\n \"acc_stderr\": 0.0267114305555384,\n \"acc_norm\": 0.7755102040816326,\n \"acc_norm_stderr\": 0.0267114305555384\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.028782108105401705,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.028782108105401705\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4502592960541729,\n \"mc2_stderr\": 0.014224680831862879\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222789\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5549658832448825,\n \"acc_stderr\": 0.0136890115674142\n }\n}\n```", "repo_url": "https://huggingface.co/seungduk/KoSOLAR-10.7B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|arc:challenge|25_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|gsm8k|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hellaswag|10_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-44-43.738734.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-44-43.738734.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-44-43.738734.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T21-44-43.738734.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-44-43.738734.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["**/details_harness|winogrande|5_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T21-44-43.738734.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T21_44_43.738734", "path": ["results_2023-12-29T21-44-43.738734.parquet"]}, {"split": "latest", "path": 
["results_2023-12-29T21-44-43.738734.parquet"]}]}]} | 2023-12-29T21:47:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of seungduk/KoSOLAR-10.7B-v0.1
Dataset automatically created during the evaluation run of model seungduk/KoSOLAR-10.7B-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
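(A minimal sketch; the repository name below is assumed from the leaderboard's usual `details_<org>__<model>` naming convention, and `harness_winogrande_5` is one of the configurations listed for this dataset.)

```python
from datasets import load_dataset

# Load one task configuration of the evaluation details;
# the "train" split always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_seungduk__KoSOLAR-10.7B-v0.1",
    "harness_winogrande_5",
    split="train")
```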
## Latest results
These are the latest results from run 2023-12-29T21:44:43.738734 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of seungduk/KoSOLAR-10.7B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model seungduk/KoSOLAR-10.7B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T21:44:43.738734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of seungduk/KoSOLAR-10.7B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model seungduk/KoSOLAR-10.7B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T21:44:43.738734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of seungduk/KoSOLAR-10.7B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model seungduk/KoSOLAR-10.7B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T21:44:43.738734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
8b8e5bd49eb9c5c65fb890e0ae9cc55df73b21d4 |
# Dataset Card for VECHR
### Dataset Summary
[VECHR: A Dataset for Explainable and Robust Classification of Vulnerability Type in the European Court of Human Rights](https://aclanthology.org/2023.emnlp-main.718/)
Recognizing vulnerability is crucial for understanding and implementing targeted support to empower individuals in need. This is especially important at the European Court of Human Rights (ECtHR), where the court adapts Convention standards to meet actual individual needs and thus to ensure effective human rights protection. However, the concept of vulnerability remains elusive at the ECtHR and no prior NLP research has dealt with it. To enable future research in this area, we present VECHR, a novel expert-annotated multi-label dataset comprising vulnerability type classification and explanation rationale. We benchmark the performance of state-of-the-art models on VECHR from both prediction and explainability perspectives. Our results demonstrate the challenging nature of the task, with lower prediction performance and limited agreement between models and experts. Further, we analyze the robustness of these models in dealing with out-of-domain (OOD) data and observe overall limited performance. Our dataset poses unique challenges, offering significant room for improvement regarding performance, explainability and robustness.
### Languages
English
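
### Loading the Dataset

A minimal loading sketch is given below. The Hub ID `sxu/VECHR` is taken from this card; the split and column layout are not documented here, so they are printed rather than assumed.

```python
from datasets import load_dataset

# Download VECHR from the Hugging Face Hub and inspect its structure.
vechr = load_dataset("sxu/VECHR")
print(vechr)  # shows the available splits, column names and sizes
```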
# Citation Information
@inproceedings{xu-etal-2023-vechr,
title = "{VECHR}: A Dataset for Explainable and Robust Classification of Vulnerability Type in the {E}uropean Court of Human Rights",
author = "Xu, Shanshan and
Staufer, Leon and
T.y.s.s, Santosh and
Ichim, Oana and
Heri, Corina and
Grabmair, Matthias",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.emnlp-main.718",
doi = "10.18653/v1/2023.emnlp-main.718",
pages = "11738--11752",
}
| sxu/VECHR | [
"size_categories:n<1K",
"language:en",
"license:afl-3.0",
"legal",
"region:us"
] | 2023-12-29T22:14:21+00:00 | {"language": ["en"], "license": "afl-3.0", "size_categories": ["n<1K"], "tags": ["legal"]} | 2023-12-29T22:42:39+00:00 | [] | [
"en"
] | TAGS
#size_categories-n<1K #language-English #license-afl-3.0 #legal #region-us
|
# Dataset Card for VECHR
### Dataset Summary
VECHR: A Dataset for Explainable and Robust Classification of Vulnerability Type in the European Court of Human Rights
Recognizing vulnerability is crucial for understanding and implementing targeted support to empower individuals in need. This is especially important at the European Court of Human Rights (ECtHR), where the court adapts Convention standards to meet actual individual needs and thus to ensure effective human rights protection. However, the concept of vulnerability remains elusive at the ECtHR and no prior NLP research has dealt with it. To enable future research in this area, we present VECHR, a novel expert-annotated multi-label dataset comprising vulnerability type classification and explanation rationale. We benchmark the performance of state-of-the-art models on VECHR from both prediction and explainability perspectives. Our results demonstrate the challenging nature of the task, with lower prediction performance and limited agreement between models and experts. Further, we analyze the robustness of these models in dealing with out-of-domain (OOD) data and observe overall limited performance. Our dataset poses unique challenges, offering significant room for improvement regarding performance, explainability and robustness.
### Languages
English
@inproceedings{xu-etal-2023-vechr,
title = "{VECHR}: A Dataset for Explainable and Robust Classification of Vulnerability Type in the {E}uropean Court of Human Rights",
author = "Xu, Shanshan and
Staufer, Leon and
T.y.s.s, Santosh and
Ichim, Oana and
Heri, Corina and
Grabmair, Matthias",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
url = "URL
doi = "10.18653/v1/URL-main.718",
pages = "11738--11752",
}
| [
"# Dataset Card for VECHR",
"### Dataset Summary\nVECHR: A Dataset for Explainable and Robust Classification of Vulnerability Type in the European Court of Human Rights\n\nRecognizing vulnerability is crucial for understanding and implementing targeted support to empower individuals in need. This is especially important at the European Court of Human Rights (ECtHR), where the court adapts Convention standards to meet actual individual needs and thus to ensure effective human rights protection. However, the concept of vulnerability remains elusive at the ECtHR and no prior NLP research has dealt with it. To enable future research in this area, we present VECHR, a novel expert-annotated multi-label dataset comprising of vulnerability type classification and explanation rationale. We benchmark the performance of state-of-the-art models on VECHR from both prediction and explainability perspective. Our results demonstrate the challenging nature of task with lower prediction performance and limited agreement between models and experts. Further, we analyze the robustness of these models in dealing with out-of-domain (OOD) data and observe overall limited performance. Our dataset poses unique challenges offering a significant room for improvement regarding performance, explainability and robustness.",
"### Languages\nEnglish\n\n\n\n @inproceedings{xu-etal-2023-vechr,\n title = \"{VECHR}: A Dataset for Explainable and Robust Classification of Vulnerability Type in the {E}uropean Court of Human Rights\",\n author = \"Xu, Shanshan and\n Staufer, Leon and\n T.y.s.s, Santosh and\n Ichim, Oana and\n Heri, Corina and\n Grabmair, Matthias\",\n booktitle = \"Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing\",\n month = dec,\n year = \"2023\",\n address = \"Singapore\",\n publisher = \"Association for Computational Linguistics\",\n url = \"URL\n doi = \"10.18653/v1/URL-main.718\",\n pages = \"11738--11752\",\n}"
] | [
"TAGS\n#size_categories-n<1K #language-English #license-afl-3.0 #legal #region-us \n",
"# Dataset Card for VECHR",
"### Dataset Summary\nVECHR: A Dataset for Explainable and Robust Classification of Vulnerability Type in the European Court of Human Rights\n\nRecognizing vulnerability is crucial for understanding and implementing targeted support to empower individuals in need. This is especially important at the European Court of Human Rights (ECtHR), where the court adapts Convention standards to meet actual individual needs and thus to ensure effective human rights protection. However, the concept of vulnerability remains elusive at the ECtHR and no prior NLP research has dealt with it. To enable future research in this area, we present VECHR, a novel expert-annotated multi-label dataset comprising of vulnerability type classification and explanation rationale. We benchmark the performance of state-of-the-art models on VECHR from both prediction and explainability perspective. Our results demonstrate the challenging nature of task with lower prediction performance and limited agreement between models and experts. Further, we analyze the robustness of these models in dealing with out-of-domain (OOD) data and observe overall limited performance. Our dataset poses unique challenges offering a significant room for improvement regarding performance, explainability and robustness.",
"### Languages\nEnglish\n\n\n\n @inproceedings{xu-etal-2023-vechr,\n title = \"{VECHR}: A Dataset for Explainable and Robust Classification of Vulnerability Type in the {E}uropean Court of Human Rights\",\n author = \"Xu, Shanshan and\n Staufer, Leon and\n T.y.s.s, Santosh and\n Ichim, Oana and\n Heri, Corina and\n Grabmair, Matthias\",\n booktitle = \"Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing\",\n month = dec,\n year = \"2023\",\n address = \"Singapore\",\n publisher = \"Association for Computational Linguistics\",\n url = \"URL\n doi = \"10.18653/v1/URL-main.718\",\n pages = \"11738--11752\",\n}"
] | [
30,
8,
264,
194
] | [
"passage: TAGS\n#size_categories-n<1K #language-English #license-afl-3.0 #legal #region-us \n# Dataset Card for VECHR### Dataset Summary\nVECHR: A Dataset for Explainable and Robust Classification of Vulnerability Type in the European Court of Human Rights\n\nRecognizing vulnerability is crucial for understanding and implementing targeted support to empower individuals in need. This is especially important at the European Court of Human Rights (ECtHR), where the court adapts Convention standards to meet actual individual needs and thus to ensure effective human rights protection. However, the concept of vulnerability remains elusive at the ECtHR and no prior NLP research has dealt with it. To enable future research in this area, we present VECHR, a novel expert-annotated multi-label dataset comprising of vulnerability type classification and explanation rationale. We benchmark the performance of state-of-the-art models on VECHR from both prediction and explainability perspective. Our results demonstrate the challenging nature of task with lower prediction performance and limited agreement between models and experts. Further, we analyze the robustness of these models in dealing with out-of-domain (OOD) data and observe overall limited performance. Our dataset poses unique challenges offering a significant room for improvement regarding performance, explainability and robustness.### Languages\nEnglish\n\n\n\n @inproceedings{xu-etal-2023-vechr,\n title = \"{VECHR}: A Dataset for Explainable and Robust Classification of Vulnerability Type in the {E}uropean Court of Human Rights\",\n author = \"Xu, Shanshan and\n Staufer, Leon and\n T.y.s.s, Santosh and\n Ichim, Oana and\n Heri, Corina and\n Grabmair, Matthias\",\n booktitle = \"Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing\",\n month = dec,\n year = \"2023\",\n address = \"Singapore\",\n publisher = \"Association for Computational Linguistics\",\n url = \"URL\n doi = \"10.18653/v1/URL-main.718\",\n pages = \"11738--11752\",\n}"
] |
9e731b8463a3f153ca41705a0bd9262eb1e3c7d7 |
# Dataset Card for Evaluation run of Walmart-the-bag/openchat-3.5-Infinity
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Walmart-the-bag/openchat-3.5-Infinity](https://huggingface.co/Walmart-the-bag/openchat-3.5-Infinity) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Walmart-the-bag__openchat-3.5-Infinity",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-29T22:24:05.513640](https://huggingface.co/datasets/open-llm-leaderboard/details_Walmart-the-bag__openchat-3.5-Infinity/blob/main/results_2023-12-29T22-24-05.513640.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6482331644164944,
"acc_stderr": 0.032166482561296374,
"acc_norm": 0.6494647991303745,
"acc_norm_stderr": 0.03282046947130963,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5198754304913998,
"mc2_stderr": 0.015470093705054921
},
"harness|arc:challenge|25": {
"acc": 0.5836177474402731,
"acc_stderr": 0.014405618279436176,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.01413770860175909
},
"harness|hellaswag|10": {
"acc": 0.6629157538338977,
"acc_stderr": 0.00471747833568963,
"acc_norm": 0.8404700258912567,
"acc_norm_stderr": 0.0036542123295166145
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723306,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723306
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.02293514405391945,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.02293514405391945
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.02394672474156397,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.02394672474156397
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381394,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381394
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468348,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37318435754189944,
"acc_stderr": 0.016175692013381957,
"acc_norm": 0.37318435754189944,
"acc_norm_stderr": 0.016175692013381957
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422466,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422466
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.01275015180292244,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.01275015180292244
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7242647058823529,
"acc_stderr": 0.027146271936625166,
"acc_norm": 0.7242647058823529,
"acc_norm_stderr": 0.027146271936625166
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.026640582539133196,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.026640582539133196
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5198754304913998,
"mc2_stderr": 0.015470093705054921
},
"harness|winogrande|5": {
"acc": 0.8011049723756906,
"acc_stderr": 0.011218629972515328
},
"harness|gsm8k|5": {
"acc": 0.6429112964366944,
"acc_stderr": 0.013197931775445206
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Walmart-the-bag__openchat-3.5-Infinity | [
"region:us"
] | 2023-12-29T22:26:24+00:00 | {"pretty_name": "Evaluation run of Walmart-the-bag/openchat-3.5-Infinity", "dataset_summary": "Dataset automatically created during the evaluation run of model [Walmart-the-bag/openchat-3.5-Infinity](https://huggingface.co/Walmart-the-bag/openchat-3.5-Infinity) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Walmart-the-bag__openchat-3.5-Infinity\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T22:24:05.513640](https://huggingface.co/datasets/open-llm-leaderboard/details_Walmart-the-bag__openchat-3.5-Infinity/blob/main/results_2023-12-29T22-24-05.513640.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6482331644164944,\n \"acc_stderr\": 0.032166482561296374,\n \"acc_norm\": 0.6494647991303745,\n \"acc_norm_stderr\": 0.03282046947130963,\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5198754304913998,\n \"mc2_stderr\": 0.015470093705054921\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5836177474402731,\n \"acc_stderr\": 0.014405618279436176,\n \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.01413770860175909\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6629157538338977,\n \"acc_stderr\": 0.00471747833568963,\n \"acc_norm\": 0.8404700258912567,\n \"acc_norm_stderr\": 0.0036542123295166145\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723306,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723306\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.02293514405391945,\n \"acc_norm\": 0.8860103626943006,\n 
\"acc_norm_stderr\": 0.02293514405391945\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.02394672474156397,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.02394672474156397\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n \"acc_stderr\": 0.014036945850381394,\n \"acc_norm\": 0.80970625798212,\n \"acc_norm_stderr\": 0.014036945850381394\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468348,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468348\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37318435754189944,\n \"acc_stderr\": 0.016175692013381957,\n \"acc_norm\": 0.37318435754189944,\n \"acc_norm_stderr\": 0.016175692013381957\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889016,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889016\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422466,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422466\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n \"acc_stderr\": 0.01275015180292244,\n \"acc_norm\": 0.47196870925684486,\n \"acc_norm_stderr\": 0.01275015180292244\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7242647058823529,\n \"acc_stderr\": 0.027146271936625166,\n \"acc_norm\": 0.7242647058823529,\n \"acc_norm_stderr\": 0.027146271936625166\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.026640582539133196,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.026640582539133196\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5198754304913998,\n \"mc2_stderr\": 0.015470093705054921\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8011049723756906,\n \"acc_stderr\": 0.011218629972515328\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6429112964366944,\n \"acc_stderr\": 0.013197931775445206\n }\n}\n```", "repo_url": "https://huggingface.co/Walmart-the-bag/openchat-3.5-Infinity", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|arc:challenge|25_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|gsm8k|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hellaswag|10_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-24-05.513640.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-24-05.513640.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-24-05.513640.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T22-24-05.513640.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-24-05.513640.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T22_24_05.513640", "path": ["**/details_harness|winogrande|5_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T22-24-05.513640.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_29T22_24_05.513640", "path": ["results_2023-12-29T22-24-05.513640.parquet"]}, {"split": "latest", "path": ["results_2023-12-29T22-24-05.513640.parquet"]}]}]} | 2023-12-29T22:26:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Walmart-the-bag/openchat-3.5-Infinity
Dataset automatically created during the evaluation run of model Walmart-the-bag/openchat-3.5-Infinity on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
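A minimal sketch, assuming the `datasets` library is installed; `harness_winogrande_5` is one of the config names declared in this card's metadata:

```python
from datasets import load_dataset

# Per-sample details for one evaluated task (Winogrande, 5-shot);
# the "train" split always points to the latest evaluation run.
data = load_dataset("open-llm-leaderboard/details_Walmart-the-bag__openchat-3.5-Infinity",
                    "harness_winogrande_5",
                    split="train")
```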
## Latest results
These are the latest results from run 2023-12-29T22:24:05.513640 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
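A hedged sketch of how the aggregated numbers can be retrieved programmatically; the `results` config and its `latest` split are declared in this card's metadata, and the exact row layout follows the harness output files:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics for each run;
# the "latest" split points at the most recent run's parquet file.
results = load_dataset("open-llm-leaderboard/details_Walmart-the-bag__openchat-3.5-Infinity",
                       "results",
                       split="latest")
print(results[0])  # inspect the aggregated accuracy / stderr fields for this run
```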
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Walmart-the-bag/openchat-3.5-Infinity\n\n\n\nDataset automatically created during the evaluation run of model Walmart-the-bag/openchat-3.5-Infinity on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T22:24:05.513640(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Walmart-the-bag/openchat-3.5-Infinity\n\n\n\nDataset automatically created during the evaluation run of model Walmart-the-bag/openchat-3.5-Infinity on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T22:24:05.513640(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Walmart-the-bag/openchat-3.5-Infinity\n\n\n\nDataset automatically created during the evaluation run of model Walmart-the-bag/openchat-3.5-Infinity on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T22:24:05.513640(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
eab6ba168f5d18806f0aae9f95c4b0ef80f85bfd |
# Dataset Card for Evaluation run of Azazelle/Half-NSFW_Noromaid-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Azazelle/Half-NSFW_Noromaid-7b](https://huggingface.co/Azazelle/Half-NSFW_Noromaid-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azazelle__Half-NSFW_Noromaid-7b",
"harness_winogrande_5",
split="train")
```
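The same pattern works for any of the other configurations listed in the metadata below. As a sketch, you can also load the aggregated scores or pin a specific run by its timestamp-named split (this assumes the "results" configuration exposes a "latest" split, as the per-task configurations do):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; the "latest" split always
# points to the newest results.
results = load_dataset(
    "open-llm-leaderboard/details_Azazelle__Half-NSFW_Noromaid-7b",
    "results",
    split="latest",
)

# Per-task details pinned to a specific run via its timestamp-named split
# (this split name is listed in the configuration metadata of this card).
winogrande_run = load_dataset(
    "open-llm-leaderboard/details_Azazelle__Half-NSFW_Noromaid-7b",
    "harness_winogrande_5",
    split="2023_12_29T22_29_37.489493",
)
```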
## Latest results
These are the [latest results from run 2023-12-29T22:29:37.489493](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__Half-NSFW_Noromaid-7b/blob/main/results_2023-12-29T22-29-37.489493.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6352281324625195,
"acc_stderr": 0.03250526564624033,
"acc_norm": 0.6410398841251862,
"acc_norm_stderr": 0.033158652839663905,
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323006,
"mc2": 0.46047173727413704,
"mc2_stderr": 0.01458373353420166
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.014370358632472434,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.014124597881844461
},
"harness|hellaswag|10": {
"acc": 0.6483768173670583,
"acc_stderr": 0.004765012078929387,
"acc_norm": 0.8482374029077873,
"acc_norm_stderr": 0.003580573563373659
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404897,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404897
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.03086868260412163,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.03086868260412163
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431395,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431395
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.031811497470553604,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.031811497470553604
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709698,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709698
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316562,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316562
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8033205619412516,
"acc_stderr": 0.01421413855691391,
"acc_norm": 0.8033205619412516,
"acc_norm_stderr": 0.01421413855691391
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247326,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247326
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26256983240223464,
"acc_stderr": 0.014716824273017761,
"acc_norm": 0.26256983240223464,
"acc_norm_stderr": 0.014716824273017761
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495036,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495036
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4445893089960887,
"acc_stderr": 0.012691575792657115,
"acc_norm": 0.4445893089960887,
"acc_norm_stderr": 0.012691575792657115
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824866,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824866
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.01909422816700032,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.01909422816700032
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323006,
"mc2": 0.46047173727413704,
"mc2_stderr": 0.01458373353420166
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.3843821076573162,
"acc_stderr": 0.013399219253698191
}
}
```
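If you prefer to work with the raw results file linked above rather than the per-task parquet details, a minimal sketch using the `huggingface_hub` library (the filename is taken from the link above; the snippet assumes the file mirrors the structure shown in the block above):

```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file for this run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Azazelle__Half-NSFW_Noromaid-7b",
    filename="results_2023-12-29T22-29-37.489493.json",
    repo_type="dataset",
)

with open(path) as f:
    run = json.load(f)

# Assuming the file mirrors the structure shown above, the averaged metrics
# live under the "all" key.
print(run["all"]["acc"], run["all"]["acc_norm"])
```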
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Azazelle__Half-NSFW_Noromaid-7b | [
"region:us"
] | 2023-12-29T22:31:55+00:00 | {"pretty_name": "Evaluation run of Azazelle/Half-NSFW_Noromaid-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Azazelle/Half-NSFW_Noromaid-7b](https://huggingface.co/Azazelle/Half-NSFW_Noromaid-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azazelle__Half-NSFW_Noromaid-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T22:29:37.489493](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__Half-NSFW_Noromaid-7b/blob/main/results_2023-12-29T22-29-37.489493.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6352281324625195,\n \"acc_stderr\": 0.03250526564624033,\n \"acc_norm\": 0.6410398841251862,\n \"acc_norm_stderr\": 0.033158652839663905,\n \"mc1\": 0.3072215422276622,\n \"mc1_stderr\": 0.016150201321323006,\n \"mc2\": 0.46047173727413704,\n \"mc2_stderr\": 0.01458373353420166\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.014370358632472434,\n \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844461\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6483768173670583,\n \"acc_stderr\": 0.004765012078929387,\n \"acc_norm\": 0.8482374029077873,\n \"acc_norm_stderr\": 0.003580573563373659\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n 
\"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404897,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404897\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959217,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959217\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.03086868260412163,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.03086868260412163\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431395,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431395\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.031811497470553604,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.031811497470553604\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709698,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709698\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.02220930907316562,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.02220930907316562\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8033205619412516,\n \"acc_stderr\": 0.01421413855691391,\n \"acc_norm\": 0.8033205619412516,\n \"acc_norm_stderr\": 0.01421413855691391\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247326,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247326\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26256983240223464,\n \"acc_stderr\": 0.014716824273017761,\n \"acc_norm\": 0.26256983240223464,\n \"acc_norm_stderr\": 0.014716824273017761\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4445893089960887,\n \"acc_stderr\": 0.012691575792657115,\n \"acc_norm\": 0.4445893089960887,\n \"acc_norm_stderr\": 0.012691575792657115\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824866,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824866\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700032,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700032\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3072215422276622,\n \"mc1_stderr\": 0.016150201321323006,\n \"mc2\": 0.46047173727413704,\n \"mc2_stderr\": 0.01458373353420166\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3843821076573162,\n \"acc_stderr\": 0.013399219253698191\n 
}\n}\n```", "repo_url": "https://huggingface.co/Azazelle/Half-NSFW_Noromaid-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|arc:challenge|25_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|gsm8k|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hellaswag|10_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-29-37.489493.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-29-37.489493.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-29-37.489493.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T22-29-37.489493.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-29-37.489493.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T22_29_37.489493", "path": ["**/details_harness|winogrande|5_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T22-29-37.489493.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_29T22_29_37.489493", "path": ["results_2023-12-29T22-29-37.489493.parquet"]}, {"split": "latest", "path": ["results_2023-12-29T22-29-37.489493.parquet"]}]}]} | 2023-12-29T22:32:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Azazelle/Half-NSFW_Noromaid-7b
Dataset automatically created during the evaluation run of model Azazelle/Half-NSFW_Noromaid-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-29T22:29:37.489493 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Azazelle/Half-NSFW_Noromaid-7b\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/Half-NSFW_Noromaid-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T22:29:37.489493(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Azazelle/Half-NSFW_Noromaid-7b\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/Half-NSFW_Noromaid-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T22:29:37.489493(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Azazelle/Half-NSFW_Noromaid-7b\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/Half-NSFW_Noromaid-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T22:29:37.489493(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
ab5f174822ddda89d9cf850842493bcea4c7547c |
# Dataset Card for Evaluation run of kyujinpy/Sakura-SOLAR-Instruct-DPO-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kyujinpy/Sakura-SOLAR-Instruct-DPO-v1](https://huggingface.co/kyujinpy/Sakura-SOLAR-Instruct-DPO-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v1",
"harness_winogrande_5",
split="train")
```
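
Since the repository exposes one configuration per evaluated task, plus the aggregated "results" configuration, it can be convenient to enumerate the available configs programmatically before picking one. A minimal sketch using the `datasets` library:

```python
from datasets import get_dataset_config_names

# Enumerate the per-task configurations together with the aggregated "results" config
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v1"
)
print(len(configs))
print(configs[:5])
```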
## Latest results
These are the [latest results from run 2023-12-29T22:30:23.063100](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v1/blob/main/results_2023-12-29T22-30-23.063100.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
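
These aggregated numbers are what the "results" configuration mentioned above stores. A minimal sketch for loading them, assuming the `latest` split naming listed in this dataset's metadata:

```python
from datasets import load_dataset

# Aggregated run-level metrics for the latest evaluation run
# (the "results" config and "latest" split come from this repo's metadata)
results = load_dataset(
    "open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v1",
    "results",
    split="latest",
)
print(results)
```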
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v1 | [
"region:us"
] | 2023-12-29T22:32:38+00:00 | {"pretty_name": "Evaluation run of kyujinpy/Sakura-SOLAR-Instruct-DPO-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [kyujinpy/Sakura-SOLAR-Instruct-DPO-v1](https://huggingface.co/kyujinpy/Sakura-SOLAR-Instruct-DPO-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T22:30:23.063100](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v1/blob/main/results_2023-12-29T22-30-23.063100.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 
0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 
0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/kyujinpy/Sakura-SOLAR-Instruct-DPO-v1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|arc:challenge|25_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|gsm8k|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hellaswag|10_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-30-23.063100.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-30-23.063100.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-30-23.063100.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T22-30-23.063100.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-30-23.063100.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-30-23.063100.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["**/details_harness|winogrande|5_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T22-30-23.063100.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T22_30_23.063100", "path": ["results_2023-12-29T22-30-23.063100.parquet"]}, {"split": "latest", "path": 
["results_2023-12-29T22-30-23.063100.parquet"]}]}]} | 2023-12-29T22:33:04+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kyujinpy/Sakura-SOLAR-Instruct-DPO-v1
Dataset automatically created during the evaluation run of model kyujinpy/Sakura-SOLAR-Instruct-DPO-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
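A minimal loading sketch is shown below; note that the repository id is inferred from the leaderboard's usual `details_<org>__<model>` naming and should be treated as an assumption:

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's details_<org>__<model> pattern.
data = load_dataset(
    "open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v1",
    "harness_winogrande_5",  # any of the 63 task configurations can be used here
    split="train",
)
```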
## Latest results
These are the latest results from run 2023-12-29T22:30:23.063100 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of kyujinpy/Sakura-SOLAR-Instruct-DPO-v1\n\n\n\nDataset automatically created during the evaluation run of model kyujinpy/Sakura-SOLAR-Instruct-DPO-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T22:30:23.063100(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kyujinpy/Sakura-SOLAR-Instruct-DPO-v1\n\n\n\nDataset automatically created during the evaluation run of model kyujinpy/Sakura-SOLAR-Instruct-DPO-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T22:30:23.063100(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
199,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kyujinpy/Sakura-SOLAR-Instruct-DPO-v1\n\n\n\nDataset automatically created during the evaluation run of model kyujinpy/Sakura-SOLAR-Instruct-DPO-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T22:30:23.063100(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
ffdfbd1a7e11205b8bbd82d5565fd945eb12ec24 |
# Dataset Card for Evaluation run of diffnamehard/Mistral-CatMacaroni-slerp-gradient
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [diffnamehard/Mistral-CatMacaroni-slerp-gradient](https://huggingface.co/diffnamehard/Mistral-CatMacaroni-slerp-gradient) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load the per-sample details for one task configuration of this run;
# "harness_winogrande_5" is one of the 63 available configurations.
data = load_dataset("open-llm-leaderboard/details_diffnamehard__Mistral-CatMacaroni-slerp-gradient",
	"harness_winogrande_5",
	split="train")
```
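The aggregated scores live in the separate "results" configuration mentioned above. The sketch below assumes the "latest" split naming used by this repository's configurations and shows one way to list the task configurations and pull the aggregated metrics:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_diffnamehard__Mistral-CatMacaroni-slerp-gradient"

# Enumerate the per-task configurations (harness_arc_challenge_25, harness_gsm8k_5, ...).
print(get_dataset_config_names(repo))

# The "results" configuration holds the aggregated metrics of the run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```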
## Latest results
These are the [latest results from run 2023-12-29T22:31:36.710491](https://huggingface.co/datasets/open-llm-leaderboard/details_diffnamehard__Mistral-CatMacaroni-slerp-gradient/blob/main/results_2023-12-29T22-31-36.710491.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6184925191759689,
"acc_stderr": 0.03292453924445066,
"acc_norm": 0.6200377010599634,
"acc_norm_stderr": 0.03359389853951756,
"mc1": 0.48225214198286415,
"mc1_stderr": 0.017492470843075363,
"mc2": 0.6409730836798082,
"mc2_stderr": 0.015085209023857519
},
"harness|arc:challenge|25": {
"acc": 0.6228668941979523,
"acc_stderr": 0.014163366896192601,
"acc_norm": 0.6552901023890785,
"acc_norm_stderr": 0.01388881628678211
},
"harness|hellaswag|10": {
"acc": 0.658334993029277,
"acc_stderr": 0.004732986187325881,
"acc_norm": 0.856602270464051,
"acc_norm_stderr": 0.0034976171082184006
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467383,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467383
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5612903225806452,
"acc_stderr": 0.028229497320317213,
"acc_norm": 0.5612903225806452,
"acc_norm_stderr": 0.028229497320317213
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593552,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.024784316942156395,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.024784316942156395
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266854,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266854
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.038448761397852714,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.038448761397852714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909476,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909476
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.69,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424383,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424383
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.024685316867257796,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.024685316867257796
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4346368715083799,
"acc_stderr": 0.016578997435496717,
"acc_norm": 0.4346368715083799,
"acc_norm_stderr": 0.016578997435496717
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818774,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818774
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.02532988817190093,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.02532988817190093
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.02971928127223685,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.02971928127223685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4595827900912647,
"acc_stderr": 0.012728446067669966,
"acc_norm": 0.4595827900912647,
"acc_norm_stderr": 0.012728446067669966
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.01943177567703731,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.01943177567703731
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4079601990049751,
"acc_stderr": 0.034751163651940926,
"acc_norm": 0.4079601990049751,
"acc_norm_stderr": 0.034751163651940926
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774707,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774707
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061452,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061452
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48225214198286415,
"mc1_stderr": 0.017492470843075363,
"mc2": 0.6409730836798082,
"mc2_stderr": 0.015085209023857519
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625842
},
"harness|gsm8k|5": {
"acc": 0.576194086429113,
"acc_stderr": 0.013611632008810357
}
}
```
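If you download the raw JSON file linked above, a short script such as the following (a minimal sketch; the local filename is an assumption) turns the per-task entries into a readable summary:

```python
import json

# Minimal sketch: assumes the JSON shown above was saved locally as
# "results_2023-12-29T22-31-36.710491.json"; adjust the path as needed.
with open("results_2023-12-29T22-31-36.710491.json") as f:
    results = json.load(f)

# Every top-level key except "all" is one benchmark entry.
for task, metrics in results.items():
    if task == "all":
        continue
    # Prefer the normalized accuracy when it is reported, otherwise fall back.
    acc = metrics.get("acc_norm", metrics.get("acc", metrics.get("mc2")))
    if acc is not None:
        print(f"{task}: {acc:.4f}")
```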
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_diffnamehard__Mistral-CatMacaroni-slerp-gradient | [
"region:us"
] | 2023-12-29T22:34:03+00:00 | {"pretty_name": "Evaluation run of diffnamehard/Mistral-CatMacaroni-slerp-gradient", "dataset_summary": "Dataset automatically created during the evaluation run of model [diffnamehard/Mistral-CatMacaroni-slerp-gradient](https://huggingface.co/diffnamehard/Mistral-CatMacaroni-slerp-gradient) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_diffnamehard__Mistral-CatMacaroni-slerp-gradient\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T22:31:36.710491](https://huggingface.co/datasets/open-llm-leaderboard/details_diffnamehard__Mistral-CatMacaroni-slerp-gradient/blob/main/results_2023-12-29T22-31-36.710491.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6184925191759689,\n \"acc_stderr\": 0.03292453924445066,\n \"acc_norm\": 0.6200377010599634,\n \"acc_norm_stderr\": 0.03359389853951756,\n \"mc1\": 0.48225214198286415,\n \"mc1_stderr\": 0.017492470843075363,\n \"mc2\": 0.6409730836798082,\n \"mc2_stderr\": 0.015085209023857519\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6228668941979523,\n \"acc_stderr\": 0.014163366896192601,\n \"acc_norm\": 0.6552901023890785,\n \"acc_norm_stderr\": 0.01388881628678211\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.658334993029277,\n \"acc_stderr\": 0.004732986187325881,\n \"acc_norm\": 0.856602270464051,\n \"acc_norm_stderr\": 0.0034976171082184006\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.038009680605548594\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467383,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467383\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.41228070175438597,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5612903225806452,\n \"acc_stderr\": 0.028229497320317213,\n \"acc_norm\": 0.5612903225806452,\n \"acc_norm_stderr\": 0.028229497320317213\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593552,\n \"acc_norm\": 0.8756476683937824,\n 
\"acc_norm_stderr\": 0.023814477086593552\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.024784316942156395,\n \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.024784316942156395\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.818348623853211,\n \"acc_stderr\": 0.016530617409266854,\n \"acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266854\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.033953227263757976,\n \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.033953227263757976\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.038448761397852714,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.038448761397852714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909476,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909476\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 0.8045977011494253,\n \"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.024685316867257796,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.024685316867257796\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4346368715083799,\n \"acc_stderr\": 0.016578997435496717,\n \"acc_norm\": 0.4346368715083799,\n \"acc_norm_stderr\": 0.016578997435496717\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818774,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818774\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.02532988817190093,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.02532988817190093\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.02971928127223685,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.02971928127223685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4595827900912647,\n \"acc_stderr\": 0.012728446067669966,\n \"acc_norm\": 0.4595827900912647,\n \"acc_norm_stderr\": 0.012728446067669966\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.01943177567703731,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.01943177567703731\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4079601990049751,\n \"acc_stderr\": 0.034751163651940926,\n \"acc_norm\": 0.4079601990049751,\n \"acc_norm_stderr\": 0.034751163651940926\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774707,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774707\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061452,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061452\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48225214198286415,\n \"mc1_stderr\": 0.017492470843075363,\n \"mc2\": 0.6409730836798082,\n \"mc2_stderr\": 0.015085209023857519\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625842\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.576194086429113,\n \"acc_stderr\": 0.013611632008810357\n }\n}\n```", "repo_url": "https://huggingface.co/diffnamehard/Mistral-CatMacaroni-slerp-gradient", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|arc:challenge|25_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|gsm8k|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hellaswag|10_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-31-36.710491.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-31-36.710491.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-31-36.710491.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T22-31-36.710491.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-31-36.710491.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T22_31_36.710491", "path": ["**/details_harness|winogrande|5_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T22-31-36.710491.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_29T22_31_36.710491", "path": ["results_2023-12-29T22-31-36.710491.parquet"]}, {"split": "latest", "path": ["results_2023-12-29T22-31-36.710491.parquet"]}]}]} | 2023-12-29T22:34:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of diffnamehard/Mistral-CatMacaroni-slerp-gradient
Dataset automatically created during the evaluation run of model diffnamehard/Mistral-CatMacaroni-slerp-gradient on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
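A minimal loading sketch is given below; the repository name follows the leaderboard's usual `details_<org>__<model>` naming pattern and is inferred from this card rather than stated explicitly, and the Winogrande configuration is only one example of the 63 available configs:

```python
from datasets import load_dataset

# Per-sample details for one task of this evaluation run.
# Repository name and config are assumptions based on the naming pattern described above.
data = load_dataset(
    "open-llm-leaderboard/details_diffnamehard__Mistral-CatMacaroni-slerp-gradient",
    "harness_winogrande_5",
    split="train",
)
```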
## Latest results
These are the latest results from run 2023-12-29T22:31:36.710491 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of diffnamehard/Mistral-CatMacaroni-slerp-gradient\n\n\n\nDataset automatically created during the evaluation run of model diffnamehard/Mistral-CatMacaroni-slerp-gradient on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T22:31:36.710491(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of diffnamehard/Mistral-CatMacaroni-slerp-gradient\n\n\n\nDataset automatically created during the evaluation run of model diffnamehard/Mistral-CatMacaroni-slerp-gradient on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T22:31:36.710491(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of diffnamehard/Mistral-CatMacaroni-slerp-gradient\n\n\n\nDataset automatically created during the evaluation run of model diffnamehard/Mistral-CatMacaroni-slerp-gradient on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T22:31:36.710491(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
c0360486a687ebc64f450e9052163487193ceeb8 |
# Dataset Card for Evaluation run of ai-forever/mGPT
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ai-forever/mGPT](https://huggingface.co/ai-forever/mGPT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ai-forever__mGPT",
"harness_winogrande_5",
split="train")
```
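For the aggregated scores mentioned above, a similar call against the "results" configuration should work; the split name "latest" used below is an assumption based on the split names listed in this card's file metadata, not something stated in the prose:

```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" points at the most recent evaluation
# (split name assumed from the data_files listing for the "results" config).
results = load_dataset(
    "open-llm-leaderboard/details_ai-forever__mGPT",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated scores per run
```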
## Latest results
These are the [latest results from run 2023-12-29T22:35:26.065619](https://huggingface.co/datasets/open-llm-leaderboard/details_ai-forever__mGPT/blob/main/results_2023-12-29T22-35-26.065619.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25087298718135403,
"acc_stderr": 0.030708943553259822,
"acc_norm": 0.2516614092703414,
"acc_norm_stderr": 0.03152915127982111,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080508,
"mc2": 0.39616632351350106,
"mc2_stderr": 0.01482764797162176
},
"harness|arc:challenge|25": {
"acc": 0.1885665529010239,
"acc_stderr": 0.011430897647675806,
"acc_norm": 0.2380546075085324,
"acc_norm_stderr": 0.0124457700280262
},
"harness|hellaswag|10": {
"acc": 0.261700856403107,
"acc_stderr": 0.004386622589119071,
"acc_norm": 0.2636924915355507,
"acc_norm_stderr": 0.004397339661695466
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073462,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073462
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.16447368421052633,
"acc_stderr": 0.0301675334686327,
"acc_norm": 0.16447368421052633,
"acc_norm_stderr": 0.0301675334686327
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2943396226415094,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.2943396226415094,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03745554791462458,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03745554791462458
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.03126511206173044,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.03126511206173044
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.02767845257821238,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.02767845257821238
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893596,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.020742740560122652,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.020742740560122652
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23870967741935484,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.23870967741935484,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233483,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233483
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.31313131313131315,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.31313131313131315,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.29015544041450775,
"acc_stderr": 0.03275264467791515,
"acc_norm": 0.29015544041450775,
"acc_norm_stderr": 0.03275264467791515
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.35128205128205126,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.35128205128205126,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.025787874220959316,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.025787874220959316
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3522935779816514,
"acc_stderr": 0.020480568843998997,
"acc_norm": 0.3522935779816514,
"acc_norm_stderr": 0.020480568843998997
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.1031390134529148,
"acc_stderr": 0.020412564289839272,
"acc_norm": 0.1031390134529148,
"acc_norm_stderr": 0.020412564289839272
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3053435114503817,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.3053435114503817,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23504273504273504,
"acc_stderr": 0.027778835904935437,
"acc_norm": 0.23504273504273504,
"acc_norm_stderr": 0.027778835904935437
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2515964240102171,
"acc_stderr": 0.015517322365529619,
"acc_norm": 0.2515964240102171,
"acc_norm_stderr": 0.015517322365529619
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.022894082489925992,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.022894082489925992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.024288619466046105,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.024288619466046105
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.20679012345679013,
"acc_stderr": 0.022535006705942825,
"acc_norm": 0.20679012345679013,
"acc_norm_stderr": 0.022535006705942825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843014,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843014
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1948529411764706,
"acc_stderr": 0.02406059942348743,
"acc_norm": 0.1948529411764706,
"acc_norm_stderr": 0.02406059942348743
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.017704531653250068,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.017704531653250068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22388059701492538,
"acc_stderr": 0.029475250236017183,
"acc_norm": 0.22388059701492538,
"acc_norm_stderr": 0.029475250236017183
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3433734939759036,
"acc_stderr": 0.03696584317010601,
"acc_norm": 0.3433734939759036,
"acc_norm_stderr": 0.03696584317010601
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080508,
"mc2": 0.39616632351350106,
"mc2_stderr": 0.01482764797162176
},
"harness|winogrande|5": {
"acc": 0.5067087608524072,
"acc_stderr": 0.014051220692330349
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ai-forever__mGPT | [
"region:us"
] | 2023-12-29T22:37:14+00:00 | {"pretty_name": "Evaluation run of ai-forever/mGPT", "dataset_summary": "Dataset automatically created during the evaluation run of model [ai-forever/mGPT](https://huggingface.co/ai-forever/mGPT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ai-forever__mGPT\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T22:35:26.065619](https://huggingface.co/datasets/open-llm-leaderboard/details_ai-forever__mGPT/blob/main/results_2023-12-29T22-35-26.065619.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25087298718135403,\n \"acc_stderr\": 0.030708943553259822,\n \"acc_norm\": 0.2516614092703414,\n \"acc_norm_stderr\": 0.03152915127982111,\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080508,\n \"mc2\": 0.39616632351350106,\n \"mc2_stderr\": 0.01482764797162176\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.1885665529010239,\n \"acc_stderr\": 0.011430897647675806,\n \"acc_norm\": 0.2380546075085324,\n \"acc_norm_stderr\": 0.0124457700280262\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.261700856403107,\n \"acc_stderr\": 0.004386622589119071,\n \"acc_norm\": 0.2636924915355507,\n \"acc_norm_stderr\": 0.004397339661695466\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.03633384414073462,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.03633384414073462\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.16447368421052633,\n \"acc_stderr\": 0.0301675334686327,\n \"acc_norm\": 0.16447368421052633,\n \"acc_norm_stderr\": 0.0301675334686327\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2943396226415094,\n \"acc_stderr\": 0.028049186315695248,\n \"acc_norm\": 0.2943396226415094,\n \"acc_norm_stderr\": 0.028049186315695248\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.03745554791462458,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.03745554791462458\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 
0.04163331998932269\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.03126511206173044,\n \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.03126511206173044\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.02767845257821238,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.02767845257821238\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893596,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2037037037037037,\n \"acc_stderr\": 0.020742740560122652,\n \"acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.020742740560122652\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23870967741935484,\n \"acc_stderr\": 0.024251071262208837,\n \"acc_norm\": 0.23870967741935484,\n \"acc_norm_stderr\": 0.024251071262208837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233483,\n \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233483\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.31313131313131315,\n \"acc_stderr\": 0.033042050878136525,\n \"acc_norm\": 0.31313131313131315,\n \"acc_norm_stderr\": 0.033042050878136525\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.29015544041450775,\n \"acc_stderr\": 0.03275264467791515,\n \"acc_norm\": 0.29015544041450775,\n \"acc_norm_stderr\": 0.03275264467791515\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.35128205128205126,\n 
\"acc_stderr\": 0.024203665177902803,\n \"acc_norm\": 0.35128205128205126,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.025787874220959316,\n \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.025787874220959316\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3522935779816514,\n \"acc_stderr\": 0.020480568843998997,\n \"acc_norm\": 0.3522935779816514,\n \"acc_norm_stderr\": 0.020480568843998997\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.1031390134529148,\n \"acc_stderr\": 0.020412564289839272,\n \"acc_norm\": 0.1031390134529148,\n \"acc_norm_stderr\": 0.020412564289839272\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.3053435114503817,\n \"acc_stderr\": 0.04039314978724561,\n \"acc_norm\": 0.3053435114503817,\n \"acc_norm_stderr\": 0.04039314978724561\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.0458212416016155,\n \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.0458212416016155\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23504273504273504,\n \"acc_stderr\": 0.027778835904935437,\n \"acc_norm\": 0.23504273504273504,\n \"acc_norm_stderr\": 0.027778835904935437\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2515964240102171,\n \"acc_stderr\": 0.015517322365529619,\n \"acc_norm\": 0.2515964240102171,\n 
\"acc_norm_stderr\": 0.015517322365529619\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.022894082489925992,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.022894082489925992\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.024288619466046105,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.024288619466046105\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.2797427652733119,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.20679012345679013,\n \"acc_stderr\": 0.022535006705942825,\n \"acc_norm\": 0.20679012345679013,\n \"acc_norm_stderr\": 0.022535006705942825\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843014,\n \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843014\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.1948529411764706,\n \"acc_stderr\": 0.02406059942348743,\n \"acc_norm\": 0.1948529411764706,\n \"acc_norm_stderr\": 0.02406059942348743\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.017704531653250068,\n \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.017704531653250068\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.2818181818181818,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22388059701492538,\n \"acc_stderr\": 0.029475250236017183,\n \"acc_norm\": 0.22388059701492538,\n \"acc_norm_stderr\": 0.029475250236017183\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3433734939759036,\n \"acc_stderr\": 0.03696584317010601,\n \"acc_norm\": 0.3433734939759036,\n \"acc_norm_stderr\": 0.03696584317010601\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.0340105262010409,\n \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.0340105262010409\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080508,\n \"mc2\": 0.39616632351350106,\n \"mc2_stderr\": 0.01482764797162176\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5067087608524072,\n \"acc_stderr\": 0.014051220692330349\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/ai-forever/mGPT", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|arc:challenge|25_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|gsm8k|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hellaswag|10_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-35-26.065619.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-35-26.065619.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-35-26.065619.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T22-35-26.065619.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-35-26.065619.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-35-26.065619.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["**/details_harness|winogrande|5_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T22-35-26.065619.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T22_35_26.065619", "path": ["results_2023-12-29T22-35-26.065619.parquet"]}, {"split": "latest", "path": 
["results_2023-12-29T22-35-26.065619.parquet"]}]}]} | 2023-12-29T22:37:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ai-forever/mGPT
Dataset automatically created during the evaluation run of model ai-forever/mGPT on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
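For example, a minimal snippet with the `datasets` library — the repository id `open-llm-leaderboard/details_ai-forever__mGPT` and the `harness_winogrande_5` configuration shown here are taken from this record's metadata:

```python
from datasets import load_dataset

# Per-sample details for the 5-shot Winogrande task of this evaluation run;
# the "train" split points to the latest results.
data = load_dataset("open-llm-leaderboard/details_ai-forever__mGPT",
                    "harness_winogrande_5",
                    split="train")
```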
## Latest results
These are the latest results from run 2023-12-29T22:35:26.065619 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ai-forever/mGPT\n\n\n\nDataset automatically created during the evaluation run of model ai-forever/mGPT on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T22:35:26.065619(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ai-forever/mGPT\n\n\n\nDataset automatically created during the evaluation run of model ai-forever/mGPT on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T22:35:26.065619(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
177,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ai-forever/mGPT\n\n\n\nDataset automatically created during the evaluation run of model ai-forever/mGPT on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T22:35:26.065619(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
ed04254592531045249c709a1c40bdd9ee917a5f |
# Dataset Card for Evaluation run of kyujinpy/Sakura-SOLAR-Instruct-DPO-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kyujinpy/Sakura-SOLAR-Instruct-DPO-v2](https://huggingface.co/kyujinpy/Sakura-SOLAR-Instruct-DPO-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v2",
"harness_winogrande_5",
split="train")
```
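The aggregated scores mentioned above live in the `results` configuration. Assuming it follows the same split layout as the per-task configurations (a timestamped split per run plus a `latest` split, as in the other evaluation datasets in this collection), it can be loaded the same way:

```python
from datasets import load_dataset

# Aggregated metrics for the newest evaluation run (sketch; the "results"
# configuration and "latest" split follow the layout used by these
# Open LLM Leaderboard details datasets).
results = load_dataset("open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v2",
                       "results",
                       split="latest")
```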
## Latest results
These are the [latest results from run 2023-12-29T22:39:58.895628](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v2/blob/main/results_2023-12-29T22-39-58.895628.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6682468010299201,
"acc_stderr": 0.031550102562656,
"acc_norm": 0.6692469699297998,
"acc_norm_stderr": 0.03219064838817908,
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7185849753394141,
"mc2_stderr": 0.014985704637518712
},
"harness|arc:challenge|25": {
"acc": 0.6851535836177475,
"acc_stderr": 0.01357265770308495,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.013273077865907593
},
"harness|hellaswag|10": {
"acc": 0.7136028679545907,
"acc_stderr": 0.004511533039406214,
"acc_norm": 0.8840868352917746,
"acc_norm_stderr": 0.003194665266078602
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736413,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736413
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6297872340425532,
"acc_stderr": 0.03156564682236786,
"acc_norm": 0.6297872340425532,
"acc_norm_stderr": 0.03156564682236786
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48677248677248675,
"acc_stderr": 0.025742297289575142,
"acc_norm": 0.48677248677248675,
"acc_norm_stderr": 0.025742297289575142
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329286,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329286
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603347,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603347
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669235,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568624,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4022346368715084,
"acc_stderr": 0.01639971673284714,
"acc_norm": 0.4022346368715084,
"acc_norm_stderr": 0.01639971673284714
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341062,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341062
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445803,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445803
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4934810951760104,
"acc_stderr": 0.012769150688867503,
"acc_norm": 0.4934810951760104,
"acc_norm_stderr": 0.012769150688867503
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.75,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.75,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.01877168389352817,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.01877168389352817
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857834,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857834
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338733,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338733
},
"harness|truthfulqa:mc|0": {
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7185849753394141,
"mc2_stderr": 0.014985704637518712
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370632
},
"harness|gsm8k|5": {
"acc": 0.6376042456406369,
"acc_stderr": 0.013240654263574762
}
}
```
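
As a rough illustration (this is not part of the original card; it assumes the dictionary printed above has been assigned to a Python variable named `results` and that `pandas` is installed), the per-task accuracies can be flattened into a small table for easier comparison:

```python
import pandas as pd

# `results` is assumed to hold the JSON object shown above.
rows = [
    {"task": task, "acc": metrics["acc"], "acc_stderr": metrics["acc_stderr"]}
    for task, metrics in results.items()
    if "acc" in metrics  # skip entries without an "acc" metric (e.g. truthfulqa:mc)
]
df = pd.DataFrame(rows).sort_values("acc", ascending=False)
print(df.head(10).to_string(index=False))
```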
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v2 | [
"region:us"
] | 2023-12-29T22:42:18+00:00 | {"pretty_name": "Evaluation run of kyujinpy/Sakura-SOLAR-Instruct-DPO-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [kyujinpy/Sakura-SOLAR-Instruct-DPO-v2](https://huggingface.co/kyujinpy/Sakura-SOLAR-Instruct-DPO-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T22:39:58.895628](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v2/blob/main/results_2023-12-29T22-39-58.895628.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6682468010299201,\n \"acc_stderr\": 0.031550102562656,\n \"acc_norm\": 0.6692469699297998,\n \"acc_norm_stderr\": 0.03219064838817908,\n \"mc1\": 0.572827417380661,\n \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7185849753394141,\n \"mc2_stderr\": 0.014985704637518712\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6851535836177475,\n \"acc_stderr\": 0.01357265770308495,\n \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907593\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7136028679545907,\n \"acc_stderr\": 0.004511533039406214,\n \"acc_norm\": 0.8840868352917746,\n \"acc_norm_stderr\": 0.003194665266078602\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736413,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736413\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6297872340425532,\n \"acc_stderr\": 0.03156564682236786,\n \"acc_norm\": 0.6297872340425532,\n \"acc_norm_stderr\": 0.03156564682236786\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.48677248677248675,\n \"acc_stderr\": 0.025742297289575142,\n \"acc_norm\": 0.48677248677248675,\n \"acc_norm_stderr\": 0.025742297289575142\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n \"acc_stderr\": 0.021732540689329286,\n \"acc_norm\": 0.8225806451612904,\n \"acc_norm_stderr\": 0.021732540689329286\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603347,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603347\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669235,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669235\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568624,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568624\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8071519795657727,\n \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123567,\n \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123567\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4022346368715084,\n \"acc_stderr\": 0.01639971673284714,\n \"acc_norm\": 0.4022346368715084,\n \"acc_norm_stderr\": 0.01639971673284714\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n \"acc_stderr\": 0.02521804037341062,\n \"acc_norm\": 0.729903536977492,\n \"acc_norm_stderr\": 0.02521804037341062\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445803,\n \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445803\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4934810951760104,\n \"acc_stderr\": 0.012769150688867503,\n \"acc_norm\": 0.4934810951760104,\n \"acc_norm_stderr\": 0.012769150688867503\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.026303648393696036,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.026303648393696036\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.01877168389352817,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.01877168389352817\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744546,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744546\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857834,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857834\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338733,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338733\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.572827417380661,\n \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7185849753394141,\n \"mc2_stderr\": 0.014985704637518712\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370632\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6376042456406369,\n \"acc_stderr\": 0.013240654263574762\n }\n}\n```", "repo_url": 
"https://huggingface.co/kyujinpy/Sakura-SOLAR-Instruct-DPO-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|arc:challenge|25_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|gsm8k|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hellaswag|10_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-39-58.895628.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-39-58.895628.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-39-58.895628.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T22-39-58.895628.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-39-58.895628.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T22_39_58.895628", "path": ["**/details_harness|winogrande|5_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T22-39-58.895628.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_29T22_39_58.895628", "path": ["results_2023-12-29T22-39-58.895628.parquet"]}, {"split": "latest", "path": ["results_2023-12-29T22-39-58.895628.parquet"]}]}]} | 2023-12-29T22:42:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kyujinpy/Sakura-SOLAR-Instruct-DPO-v2
Dataset automatically created during the evaluation run of model kyujinpy/Sakura-SOLAR-Instruct-DPO-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-29T22:39:58.895628 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of kyujinpy/Sakura-SOLAR-Instruct-DPO-v2\n\n\n\nDataset automatically created during the evaluation run of model kyujinpy/Sakura-SOLAR-Instruct-DPO-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T22:39:58.895628(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kyujinpy/Sakura-SOLAR-Instruct-DPO-v2\n\n\n\nDataset automatically created during the evaluation run of model kyujinpy/Sakura-SOLAR-Instruct-DPO-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T22:39:58.895628(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
199,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kyujinpy/Sakura-SOLAR-Instruct-DPO-v2\n\n\n\nDataset automatically created during the evaluation run of model kyujinpy/Sakura-SOLAR-Instruct-DPO-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T22:39:58.895628(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
1427f0a9917ebc15d1dbbfd5d42a84254ef747b0 |
# Dataset Card for Evaluation run of Mihaiii/Pallas-0.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Mihaiii/Pallas-0.5](https://huggingface.co/Mihaiii/Pallas-0.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mihaiii__Pallas-0.5",
"harness_winogrande_5",
split="train")
```
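The aggregated scores described above live in the separate `"results"` configuration. The snippet below is a minimal sketch (assuming the configuration and split naming convention stated in this card; adjust it if the repository layout differs) showing how to pull the latest aggregated run:

```python
from datasets import load_dataset

# Sketch: the "results" config and the "latest" split follow the naming
# convention described above for this evaluation repository.
results = load_dataset("open-llm-leaderboard/details_Mihaiii__Pallas-0.5",
                       "results",
                       split="latest")

# Each row corresponds to one evaluation run; list the stored fields.
print(results.column_names)
```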
## Latest results
These are the [latest results from run 2023-12-29T23:24:20.042854](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Pallas-0.5/blob/main/results_2023-12-29T23-24-20.042854.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7448549905989788,
"acc_stderr": 0.029000653853438103,
"acc_norm": 0.7498008582741917,
"acc_norm_stderr": 0.029547828372766274,
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.5688216466539537,
"mc2_stderr": 0.015796140147708485
},
"harness|arc:challenge|25": {
"acc": 0.6254266211604096,
"acc_stderr": 0.014144193471893452,
"acc_norm": 0.6476109215017065,
"acc_norm_stderr": 0.01396014260059868
},
"harness|hellaswag|10": {
"acc": 0.6428002389962159,
"acc_stderr": 0.004781950883460502,
"acc_norm": 0.8345947022505477,
"acc_norm_stderr": 0.0037078660457296048
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8552631578947368,
"acc_stderr": 0.028631951845930387,
"acc_norm": 0.8552631578947368,
"acc_norm_stderr": 0.028631951845930387
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8037735849056604,
"acc_stderr": 0.024442388131100817,
"acc_norm": 0.8037735849056604,
"acc_norm_stderr": 0.024442388131100817
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8680555555555556,
"acc_stderr": 0.02830096838204443,
"acc_norm": 0.8680555555555556,
"acc_norm_stderr": 0.02830096838204443
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.03345036916788991,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.03345036916788991
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387533,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387533
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7310344827586207,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.7310344827586207,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6772486772486772,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.6772486772486772,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9,
"acc_stderr": 0.017066403719657255,
"acc_norm": 0.9,
"acc_norm_stderr": 0.017066403719657255
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6748768472906403,
"acc_stderr": 0.032957975663112704,
"acc_norm": 0.6748768472906403,
"acc_norm_stderr": 0.032957975663112704
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284332,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284332
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9141414141414141,
"acc_stderr": 0.01996022556317289,
"acc_norm": 0.9141414141414141,
"acc_norm_stderr": 0.01996022556317289
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527041,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527041
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8,
"acc_stderr": 0.020280805062535726,
"acc_norm": 0.8,
"acc_norm_stderr": 0.020280805062535726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.0301144420196681,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.0301144420196681
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8277310924369747,
"acc_stderr": 0.024528664971305424,
"acc_norm": 0.8277310924369747,
"acc_norm_stderr": 0.024528664971305424
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9119266055045872,
"acc_stderr": 0.012150743719481653,
"acc_norm": 0.9119266055045872,
"acc_norm_stderr": 0.012150743719481653
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.032757734861009996,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.032757734861009996
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073315,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073315
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.01926932302564027,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.01926932302564027
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7847533632286996,
"acc_stderr": 0.027584066602208274,
"acc_norm": 0.7847533632286996,
"acc_norm_stderr": 0.027584066602208274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744631,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744631
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.02919980245562281,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.02919980245562281
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631001,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631001
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.026845765054553855,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.026845765054553855
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.017456987872436186,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.017456987872436186
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9042145593869731,
"acc_stderr": 0.010524031079055838,
"acc_norm": 0.9042145593869731,
"acc_norm_stderr": 0.010524031079055838
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.02115267696657528,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.02115267696657528
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6670391061452514,
"acc_stderr": 0.015761716178397563,
"acc_norm": 0.6670391061452514,
"acc_norm_stderr": 0.015761716178397563
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7973856209150327,
"acc_stderr": 0.023015446877985665,
"acc_norm": 0.7973856209150327,
"acc_norm_stderr": 0.023015446877985665
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7813504823151125,
"acc_stderr": 0.02347558141786111,
"acc_norm": 0.7813504823151125,
"acc_norm_stderr": 0.02347558141786111
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8734567901234568,
"acc_stderr": 0.018498600558790906,
"acc_norm": 0.8734567901234568,
"acc_norm_stderr": 0.018498600558790906
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6063829787234043,
"acc_stderr": 0.02914454478159616,
"acc_norm": 0.6063829787234043,
"acc_norm_stderr": 0.02914454478159616
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5834419817470665,
"acc_stderr": 0.01259115324505739,
"acc_norm": 0.5834419817470665,
"acc_norm_stderr": 0.01259115324505739
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8014705882352942,
"acc_stderr": 0.024231013370541083,
"acc_norm": 0.8014705882352942,
"acc_norm_stderr": 0.024231013370541083
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7973856209150327,
"acc_stderr": 0.01626105528374613,
"acc_norm": 0.7973856209150327,
"acc_norm_stderr": 0.01626105528374613
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8448979591836735,
"acc_stderr": 0.0231747988612186,
"acc_norm": 0.8448979591836735,
"acc_norm_stderr": 0.0231747988612186
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.027265992434429103,
"acc_norm": 0.92,
"acc_norm_stderr": 0.027265992434429103
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.5688216466539537,
"mc2_stderr": 0.015796140147708485
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.01095971643524291
},
"harness|gsm8k|5": {
"acc": 0.5989385898407885,
"acc_stderr": 0.01350015892224554
}
}
```
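To compare per-task scores programmatically, one option (a sketch that assumes the dict printed above has been saved to a local JSON file; the filename below is illustrative) is to rank the tasks by normalized accuracy:

```python
import json

# Sketch: "latest_results.json" is assumed to hold the dict shown above,
# e.g. a local copy of the linked results file.
with open("latest_results.json") as f:
    scores = json.load(f)

# Rank tasks by normalized accuracy, highest first, skipping entries
# (such as truthfulqa:mc) that do not report acc_norm.
ranked = sorted(
    ((task, metrics["acc_norm"])
     for task, metrics in scores.items()
     if "acc_norm" in metrics),
    key=lambda pair: pair[1],
    reverse=True,
)
for task, acc_norm in ranked[:5]:
    print(f"{task}: {acc_norm:.3f}")
```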
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Mihaiii__Pallas-0.5 | [
"region:us"
] | 2023-12-29T23:26:35+00:00 | {"pretty_name": "Evaluation run of Mihaiii/Pallas-0.5", "dataset_summary": "Dataset automatically created during the evaluation run of model [Mihaiii/Pallas-0.5](https://huggingface.co/Mihaiii/Pallas-0.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mihaiii__Pallas-0.5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T23:24:20.042854](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Pallas-0.5/blob/main/results_2023-12-29T23-24-20.042854.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7448549905989788,\n \"acc_stderr\": 0.029000653853438103,\n \"acc_norm\": 0.7498008582741917,\n \"acc_norm_stderr\": 0.029547828372766274,\n \"mc1\": 0.4149326805385557,\n \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5688216466539537,\n \"mc2_stderr\": 0.015796140147708485\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.014144193471893452,\n \"acc_norm\": 0.6476109215017065,\n \"acc_norm_stderr\": 0.01396014260059868\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6428002389962159,\n \"acc_stderr\": 0.004781950883460502,\n \"acc_norm\": 0.8345947022505477,\n \"acc_norm_stderr\": 0.0037078660457296048\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8552631578947368,\n \"acc_stderr\": 0.028631951845930387,\n \"acc_norm\": 0.8552631578947368,\n \"acc_norm_stderr\": 0.028631951845930387\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8037735849056604,\n \"acc_stderr\": 0.024442388131100817,\n \"acc_norm\": 0.8037735849056604,\n \"acc_norm_stderr\": 0.024442388131100817\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8680555555555556,\n \"acc_stderr\": 0.02830096838204443,\n \"acc_norm\": 0.8680555555555556,\n \"acc_norm_stderr\": 0.02830096838204443\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n 
\"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.03345036916788991,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.03345036916788991\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.04959859966384181,\n \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.04959859966384181\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.774468085106383,\n \"acc_stderr\": 0.027321078417387533,\n \"acc_norm\": 0.774468085106383,\n \"acc_norm_stderr\": 0.027321078417387533\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7310344827586207,\n \"acc_stderr\": 0.036951833116502325,\n \"acc_norm\": 0.7310344827586207,\n \"acc_norm_stderr\": 0.036951833116502325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6772486772486772,\n \"acc_stderr\": 0.024078943243597016,\n \"acc_norm\": 0.6772486772486772,\n \"acc_norm_stderr\": 0.024078943243597016\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.017066403719657255,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.017066403719657255\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6748768472906403,\n \"acc_stderr\": 0.032957975663112704,\n \"acc_norm\": 0.6748768472906403,\n \"acc_norm_stderr\": 0.032957975663112704\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284332,\n \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284332\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9141414141414141,\n \"acc_stderr\": 0.01996022556317289,\n \"acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.01996022556317289\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527041,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527041\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.020280805062535726,\n 
\"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.020280805062535726\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4222222222222222,\n \"acc_stderr\": 0.0301144420196681,\n \"acc_norm\": 0.4222222222222222,\n \"acc_norm_stderr\": 0.0301144420196681\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8277310924369747,\n \"acc_stderr\": 0.024528664971305424,\n \"acc_norm\": 0.8277310924369747,\n \"acc_norm_stderr\": 0.024528664971305424\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9119266055045872,\n \"acc_stderr\": 0.012150743719481653,\n \"acc_norm\": 0.9119266055045872,\n \"acc_norm_stderr\": 0.012150743719481653\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.032757734861009996,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.032757734861009996\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073315,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073315\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.01926932302564027,\n \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.01926932302564027\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n \"acc_stderr\": 0.027584066602208274,\n \"acc_norm\": 0.7847533632286996,\n \"acc_norm_stderr\": 0.027584066602208274\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744631,\n \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744631\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.02919980245562281,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.02919980245562281\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243631001,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243631001\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553855,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553855\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n \"acc_stderr\": 0.017456987872436186,\n \"acc_norm\": 0.9230769230769231,\n \"acc_norm_stderr\": 0.017456987872436186\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9042145593869731,\n \"acc_stderr\": 0.010524031079055838,\n \"acc_norm\": 0.9042145593869731,\n \"acc_norm_stderr\": 0.010524031079055838\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.02115267696657528,\n \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.02115267696657528\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6670391061452514,\n \"acc_stderr\": 0.015761716178397563,\n \"acc_norm\": 0.6670391061452514,\n \"acc_norm_stderr\": 0.015761716178397563\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7973856209150327,\n \"acc_stderr\": 0.023015446877985665,\n \"acc_norm\": 0.7973856209150327,\n \"acc_norm_stderr\": 0.023015446877985665\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.7813504823151125,\n \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8734567901234568,\n \"acc_stderr\": 0.018498600558790906,\n \"acc_norm\": 0.8734567901234568,\n \"acc_norm_stderr\": 0.018498600558790906\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6063829787234043,\n \"acc_stderr\": 0.02914454478159616,\n \"acc_norm\": 0.6063829787234043,\n \"acc_norm_stderr\": 0.02914454478159616\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5834419817470665,\n \"acc_stderr\": 0.01259115324505739,\n \"acc_norm\": 0.5834419817470665,\n \"acc_norm_stderr\": 0.01259115324505739\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8014705882352942,\n \"acc_stderr\": 0.024231013370541083,\n \"acc_norm\": 0.8014705882352942,\n \"acc_norm_stderr\": 0.024231013370541083\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7973856209150327,\n \"acc_stderr\": 0.01626105528374613,\n \"acc_norm\": 0.7973856209150327,\n \"acc_norm_stderr\": 0.01626105528374613\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8448979591836735,\n \"acc_stderr\": 0.0231747988612186,\n \"acc_norm\": 0.8448979591836735,\n \"acc_norm_stderr\": 0.0231747988612186\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.027265992434429103,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.027265992434429103\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4149326805385557,\n \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5688216466539537,\n \"mc2_stderr\": 0.015796140147708485\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.01095971643524291\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5989385898407885,\n \"acc_stderr\": 0.01350015892224554\n }\n}\n```", "repo_url": "https://huggingface.co/Mihaiii/Pallas-0.5", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|arc:challenge|25_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|gsm8k|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hellaswag|10_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T23-24-20.042854.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T23-24-20.042854.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T23-24-20.042854.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T23-24-20.042854.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T23-24-20.042854.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T23-24-20.042854.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["**/details_harness|winogrande|5_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T23-24-20.042854.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T23_24_20.042854", "path": ["results_2023-12-29T23-24-20.042854.parquet"]}, {"split": "latest", "path": 
["results_2023-12-29T23-24-20.042854.parquet"]}]}]} | 2023-12-29T23:26:56+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Mihaiii/Pallas-0.5
Dataset automatically created during the evaluation run of model Mihaiii/Pallas-0.5 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
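A minimal sketch, assuming this run's details are published under the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention (i.e. `open-llm-leaderboard/details_Mihaiii__Pallas-0.5`):

```python
from datasets import load_dataset

# The repo id below is inferred from the leaderboard's usual naming convention;
# adjust it if the details for Mihaiii/Pallas-0.5 live under a different path.
data = load_dataset("open-llm-leaderboard/details_Mihaiii__Pallas-0.5",
	"harness_winogrande_5",
	split="train")
```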
## Latest results
These are the latest results from run 2023-12-29T23:24:20.042854 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Mihaiii/Pallas-0.5\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Pallas-0.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T23:24:20.042854(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Mihaiii/Pallas-0.5\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Pallas-0.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T23:24:20.042854(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
175,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Mihaiii/Pallas-0.5\n\n\n\nDataset automatically created during the evaluation run of model Mihaiii/Pallas-0.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T23:24:20.042854(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
1015a8051c57eaf167066e4d56f58e2b15599fdd |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This is a subset of the first 20k entries of SlimOrca-Dedup.
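A minimal loading sketch (the repo id `Thermostatic/miniorca` matches this card; the single `train` split is an assumption based on the usual single-split layout):

```python
from datasets import load_dataset

# Load the 20k-entry subset; the "train" split name is assumed.
miniorca = load_dataset("Thermostatic/miniorca", split="train")

print(miniorca)      # number of rows and column names
print(miniorca[0])   # first example
```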
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** Myself & OpenOrca
- **Funded by [optional]:** Myself
- **Shared by [optional]:** Myself
- **Language(s) (NLP):** English
- **License:** MIT
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
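As a hedged illustration only (not the author's documented pipeline): given the summary above, the subset could be reproduced roughly as follows, assuming the source repo id `Open-Orca/SlimOrca-Dedup` and a simple first-20k slice in original order.

```python
from datasets import load_dataset

# Hypothetical reconstruction: slice the first 20k rows of SlimOrca-Dedup.
# The source repo id and the "first 20k in original order" selection are assumptions.
subset = load_dataset("Open-Orca/SlimOrca-Dedup", split="train[:20000]")
subset.push_to_hub("Thermostatic/miniorca")  # publish under this card's repo id
```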
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | Thermostatic/miniorca | [
"license:mit",
"region:us"
] | 2023-12-29T23:37:38+00:00 | {"license": "mit"} | 2023-12-29T23:40:25+00:00 | [] | [] | TAGS
#license-mit #region-us
|
# Dataset Card for Dataset Name
This is a subset of the first 20k entries of SlimOrca-Dedup.
## Dataset Details
### Dataset Description
- Curated by: Myself & OpenOrca
- Funded by [optional]: Myself
- Shared by [optional]: Myself
- Language(s) (NLP): English
- License: MIT
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis is a subset of the first 20k entries of SlimOrca-Dedup.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: Myself & OpenOrca\n- Funded by [optional]: Myself\n- Shared by [optional]: Myself\n- Language(s) (NLP): English\n- License: MIT",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#license-mit #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis is a subset of the first 20k entries of SlimOrca-Dedup.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: Myself & OpenOrca\n- Funded by [optional]: Myself\n- Shared by [optional]: Myself\n- Language(s) (NLP): English\n- License: MIT",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
11,
29,
4,
52,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#license-mit #region-us \n# Dataset Card for Dataset Name\n\n\n\nThis is a subset of the first 20k entries of SlimOrca-Dedup.## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: Myself & OpenOrca\n- Funded by [optional]: Myself\n- Shared by [optional]: Myself\n- Language(s) (NLP): English\n- License: MIT### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
d654bc044bac558f572ed6cff3d169155082b3a9 |
# Dataset Card for Evaluation run of scb10x/typhoon-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [scb10x/typhoon-7b](https://huggingface.co/scb10x/typhoon-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_scb10x__typhoon-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-29T23:54:04.797945](https://huggingface.co/datasets/open-llm-leaderboard/details_scb10x__typhoon-7b/blob/main/results_2023-12-29T23-54-04.797945.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5930329978732358,
"acc_stderr": 0.03323235696154905,
"acc_norm": 0.5989902156875104,
"acc_norm_stderr": 0.03392042546889035,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.01534540948555798,
"mc2": 0.4052198339452636,
"mc2_stderr": 0.014069431569242152
},
"harness|arc:challenge|25": {
"acc": 0.5435153583617748,
"acc_stderr": 0.01455594976049644,
"acc_norm": 0.5853242320819113,
"acc_norm_stderr": 0.014397070564409174
},
"harness|hellaswag|10": {
"acc": 0.6101374228241386,
"acc_stderr": 0.004867221634461272,
"acc_norm": 0.8154750049790879,
"acc_norm_stderr": 0.0038711896202760685
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.042992689054808644,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.042992689054808644
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.02964781353936524,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.02964781353936524
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.03268335899936336,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.03268335899936336
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.02590608702131929,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.02590608702131929
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365907,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365907
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.02614848346915331,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.02614848346915331
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.02478431694215638,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.02478431694215638
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606649,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606649
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7706422018348624,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.7706422018348624,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923393,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923393
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.02390232554956039,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.02390232554956039
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7867177522349936,
"acc_stderr": 0.014648172749593524,
"acc_norm": 0.7867177522349936,
"acc_norm_stderr": 0.014648172749593524
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.02541600377316555,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.02541600377316555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3106145251396648,
"acc_stderr": 0.015476515438005566,
"acc_norm": 0.3106145251396648,
"acc_norm_stderr": 0.015476515438005566
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.0264930332251459,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.0264930332251459
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.012604960816087375,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.012604960816087375
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.02993534270787774,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.02993534270787774
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.01945076843250551,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.01945076843250551
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.01534540948555798,
"mc2": 0.4052198339452636,
"mc2_stderr": 0.014069431569242152
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237992
},
"harness|gsm8k|5": {
"acc": 0.3161485974222896,
"acc_stderr": 0.012807630673451495
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_scb10x__typhoon-7b | [
"region:us"
] | 2023-12-29T23:56:22+00:00 | {"pretty_name": "Evaluation run of scb10x/typhoon-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [scb10x/typhoon-7b](https://huggingface.co/scb10x/typhoon-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_scb10x__typhoon-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-29T23:54:04.797945](https://huggingface.co/datasets/open-llm-leaderboard/details_scb10x__typhoon-7b/blob/main/results_2023-12-29T23-54-04.797945.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5930329978732358,\n \"acc_stderr\": 0.03323235696154905,\n \"acc_norm\": 0.5989902156875104,\n \"acc_norm_stderr\": 0.03392042546889035,\n \"mc1\": 0.2594859241126071,\n \"mc1_stderr\": 0.01534540948555798,\n \"mc2\": 0.4052198339452636,\n \"mc2_stderr\": 0.014069431569242152\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5435153583617748,\n \"acc_stderr\": 0.01455594976049644,\n \"acc_norm\": 0.5853242320819113,\n \"acc_norm_stderr\": 0.014397070564409174\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6101374228241386,\n \"acc_stderr\": 0.004867221634461272,\n \"acc_norm\": 0.8154750049790879,\n \"acc_norm_stderr\": 0.0038711896202760685\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n \"acc_stderr\": 0.042992689054808644,\n \"acc_norm\": 0.5481481481481482,\n \"acc_norm_stderr\": 0.042992689054808644\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.02964781353936524,\n \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.02964781353936524\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n 
\"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.03268335899936336,\n \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.03268335899936336\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n \"acc_stderr\": 0.02590608702131929,\n \"acc_norm\": 0.7064516129032258,\n \"acc_norm_stderr\": 0.02590608702131929\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365907,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365907\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.02614848346915331,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.02614848346915331\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6051282051282051,\n 
\"acc_stderr\": 0.02478431694215638,\n \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.02478431694215638\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606649,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606649\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7706422018348624,\n \"acc_stderr\": 0.018025349724618684,\n \"acc_norm\": 0.7706422018348624,\n \"acc_norm_stderr\": 0.018025349724618684\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923393,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923393\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n \"acc_stderr\": 0.02390232554956039,\n \"acc_norm\": 0.8418803418803419,\n \"acc_norm_stderr\": 0.02390232554956039\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7867177522349936,\n \"acc_stderr\": 0.014648172749593524,\n \"acc_norm\": 0.7867177522349936,\n 
\"acc_norm_stderr\": 0.014648172749593524\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.02541600377316555,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.02541600377316555\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3106145251396648,\n \"acc_stderr\": 0.015476515438005566,\n \"acc_norm\": 0.3106145251396648,\n \"acc_norm_stderr\": 0.015476515438005566\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.0264930332251459,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.0264930332251459\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n \"acc_stderr\": 0.012604960816087375,\n \"acc_norm\": 0.4198174706649283,\n \"acc_norm_stderr\": 0.012604960816087375\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.02993534270787774,\n \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.02993534270787774\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.01945076843250551,\n \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.01945076843250551\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n \"mc1_stderr\": 0.01534540948555798,\n \"mc2\": 0.4052198339452636,\n \"mc2_stderr\": 0.014069431569242152\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237992\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3161485974222896,\n \"acc_stderr\": 0.012807630673451495\n }\n}\n```", "repo_url": "https://huggingface.co/scb10x/typhoon-7b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|arc:challenge|25_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|gsm8k|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hellaswag|10_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T23-54-04.797945.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T23-54-04.797945.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-29T23-54-04.797945.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-29T23-54-04.797945.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T23-54-04.797945.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T23-54-04.797945.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["**/details_harness|winogrande|5_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-29T23-54-04.797945.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_29T23_54_04.797945", "path": ["results_2023-12-29T23-54-04.797945.parquet"]}, {"split": "latest", "path": 
["results_2023-12-29T23-54-04.797945.parquet"]}]}]} | 2023-12-29T23:56:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of scb10x/typhoon-7b
Dataset automatically created during the evaluation run of model scb10x/typhoon-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
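A minimal loading sketch (the repository id below is an assumption, inferred from the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention for this run; adjust it if the actual id differs):

```python
from datasets import load_dataset

# Hypothetical repository id following the leaderboard naming convention.
data = load_dataset("open-llm-leaderboard/details_scb10x__typhoon-7b",
	"harness_winogrande_5",
	split="train")
```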
## Latest results
These are the latest results from run 2023-12-29T23:54:04.797945 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of scb10x/typhoon-7b\n\n\n\nDataset automatically created during the evaluation run of model scb10x/typhoon-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T23:54:04.797945(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of scb10x/typhoon-7b\n\n\n\nDataset automatically created during the evaluation run of model scb10x/typhoon-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-29T23:54:04.797945(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of scb10x/typhoon-7b\n\n\n\nDataset automatically created during the evaluation run of model scb10x/typhoon-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-29T23:54:04.797945(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
92740d38a98dce2f166d58adfbf243ad403053fe | test | Theastr4/Test_NER | [
"region:us"
] | 2023-12-30T00:08:43+00:00 | {} | 2023-12-30T00:20:09+00:00 | [] | [] | TAGS
#region-us
| test | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
9623c57029f624578d7ff3140b7fea6938eb541b |
This is a constructed demo dataset for alignment/preference learning.

With partially handcrafted questions (prompts), the answers are generated by the [phi-2](https://huggingface.co/microsoft/phi-2) model with temperature `0.2`, and the answers are scored and selected by the [deberta-large-v2](https://huggingface.co/OpenAssistant/reward-model-deberta-v3-large-v2) reward model.

The dataset contains the questions and their answers ranked from highest to lowest reward, decoded with rejection sampling `K=8`.
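Below is a minimal sketch of the rejection-sampling loop described above. The exact code used to build the dataset is not published here, so the generation settings, prompt formatting and scoring details are assumptions; only the model ids, `K=8`, and temperature `0.2` come from the description.

```python
import torch
from transformers import (AutoModelForCausalLM, AutoModelForSequenceClassification,
                          AutoTokenizer)

# Generator (phi-2) and reward model (deberta-v3-large-v2), as described above.
gen_tok = AutoTokenizer.from_pretrained("microsoft/phi-2")
gen = AutoModelForCausalLM.from_pretrained("microsoft/phi-2")
rm_id = "OpenAssistant/reward-model-deberta-v3-large-v2"
rm_tok = AutoTokenizer.from_pretrained(rm_id)
rm = AutoModelForSequenceClassification.from_pretrained(rm_id)

def rejection_sample(question: str, k: int = 8, temperature: float = 0.2):
    """Draw K candidate answers, score them with the reward model, rank them."""
    inputs = gen_tok(question, return_tensors="pt")
    outputs = gen.generate(**inputs, do_sample=True, temperature=temperature,
                           max_new_tokens=256, num_return_sequences=k,
                           pad_token_id=gen_tok.eos_token_id)
    prompt_len = inputs["input_ids"].shape[1]
    answers = [gen_tok.decode(o[prompt_len:], skip_special_tokens=True) for o in outputs]
    rewards = []
    for answer in answers:
        rm_inputs = rm_tok(question, answer, return_tensors="pt", truncation=True)
        with torch.no_grad():
            rewards.append(rm(**rm_inputs).logits[0].item())
    # Highest-reward answer first (the "selected" one); the rest are "rejected".
    return sorted(zip(answers, rewards), key=lambda x: x[1], reverse=True)
```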
Example loading:
```python
import datasets
ds = datasets.load_dataset('yizhilll/demo_rejection_sampling_QA_phi-2_deberta-v3-large-v2_temp0.2')
print(ds)
``` | yizhilll/demo_rejection_sampling_QA_phi-2_deberta-v3-large-v2_temp0.2 | [
"task_categories:text-generation",
"size_categories:n<1K",
"language:en",
"region:us"
] | 2023-12-30T00:09:51+00:00 | {"language": ["en"], "size_categories": ["n<1K"], "task_categories": ["text-generation"], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "base_follow_difficulty", "dtype": "string"}, {"name": "tag", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "selected_answer", "dtype": "string"}, {"name": "selected_reward", "dtype": "float64"}, {"name": "rejected_answers", "sequence": "string"}, {"name": "rejected_rewards", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 29297, "num_examples": 10}], "download_size": 20255, "dataset_size": 29297}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-30T00:18:08+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-n<1K #language-English #region-us
|
This is a constructed demo dataset for alignment/preference learning.

With partially handcrafted questions (prompts), the answers are generated by the phi-2 model with temperature '0.2', and the answers are scored and selected by the deberta-large-v2.

The dataset contains the questions and their answers ranked from highest to lowest reward, decoded with rejection sampling 'K=8'.
Example loading:
| [] | [
"TAGS\n#task_categories-text-generation #size_categories-n<1K #language-English #region-us \n"
] | [
31
] | [
"passage: TAGS\n#task_categories-text-generation #size_categories-n<1K #language-English #region-us \n"
] |
e887a486655719003a4bb89af2890839e7638699 |
# Dataset Card for Evaluation run of beowolx/CodeNinja-1.0-OpenChat-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [beowolx/CodeNinja-1.0-OpenChat-7B](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beowolx__CodeNinja-1.0-OpenChat-7B",
"harness_winogrande_5",
split="train")
```
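The aggregated metrics can be pulled the same way through the `results` configuration. A small sketch, assuming this card follows the same split naming used by the other leaderboard details datasets (a timestamped split plus a "latest" alias):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_beowolx__CodeNinja-1.0-OpenChat-7B",
	"results",
	split="latest")
print(results[0])
```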
## Latest results
These are the [latest results from run 2023-12-30T00:15:01.452444](https://huggingface.co/datasets/open-llm-leaderboard/details_beowolx__CodeNinja-1.0-OpenChat-7B/blob/main/results_2023-12-30T00-15-01.452444.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6402511076573505,
"acc_stderr": 0.03227558523986021,
"acc_norm": 0.64104515726518,
"acc_norm_stderr": 0.03293549849526372,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059605,
"mc2": 0.47161024077294855,
"mc2_stderr": 0.014885155226330158
},
"harness|arc:challenge|25": {
"acc": 0.6015358361774744,
"acc_stderr": 0.014306946052735565,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.0140702655192688
},
"harness|hellaswag|10": {
"acc": 0.6398127862975503,
"acc_stderr": 0.0047907346837045865,
"acc_norm": 0.8364867556263692,
"acc_norm_stderr": 0.0036907745636380125
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03800968060554859,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.03800968060554859
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778415,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778415
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768756,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768756
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094753,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094753
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.01570349834846177,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.01570349834846177
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.038448761397852714,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.038448761397852714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165623,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165623
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899133,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899133
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468348,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.015366860386397112,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.015366860386397112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179615,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179615
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495036,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495036
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4745762711864407,
"acc_stderr": 0.012753716929101001,
"acc_norm": 0.4745762711864407,
"acc_norm_stderr": 0.012753716929101001
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059605,
"mc2": 0.47161024077294855,
"mc2_stderr": 0.014885155226330158
},
"harness|winogrande|5": {
"acc": 0.797947908445146,
"acc_stderr": 0.011285013754047444
},
"harness|gsm8k|5": {
"acc": 0.66565579984837,
"acc_stderr": 0.012994634003332766
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_beowolx__CodeNinja-1.0-OpenChat-7B | [
"region:us"
] | 2023-12-30T00:17:16+00:00 | {"pretty_name": "Evaluation run of beowolx/CodeNinja-1.0-OpenChat-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [beowolx/CodeNinja-1.0-OpenChat-7B](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beowolx__CodeNinja-1.0-OpenChat-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T00:15:01.452444](https://huggingface.co/datasets/open-llm-leaderboard/details_beowolx__CodeNinja-1.0-OpenChat-7B/blob/main/results_2023-12-30T00-15-01.452444.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6402511076573505,\n \"acc_stderr\": 0.03227558523986021,\n \"acc_norm\": 0.64104515726518,\n \"acc_norm_stderr\": 0.03293549849526372,\n \"mc1\": 0.31334149326805383,\n \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.47161024077294855,\n \"mc2_stderr\": 0.014885155226330158\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6015358361774744,\n \"acc_stderr\": 0.014306946052735565,\n \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.0140702655192688\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6398127862975503,\n \"acc_stderr\": 0.0047907346837045865,\n \"acc_norm\": 0.8364867556263692,\n \"acc_norm_stderr\": 0.0036907745636380125\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.03800968060554859,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.03800968060554859\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778415,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778415\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768756,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768756\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094753,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094753\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.01570349834846177,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.01570349834846177\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.038448761397852714,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.038448761397852714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165623,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165623\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8122605363984674,\n \"acc_stderr\": 0.013964393769899133,\n \"acc_norm\": 0.8122605363984674,\n \"acc_norm_stderr\": 0.013964393769899133\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468348,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468348\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n \"acc_stderr\": 0.015366860386397112,\n \"acc_norm\": 0.3027932960893855,\n \"acc_norm_stderr\": 0.015366860386397112\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n \"acc_stderr\": 0.012753716929101001,\n \"acc_norm\": 0.4745762711864407,\n \"acc_norm_stderr\": 0.012753716929101001\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.47161024077294855,\n \"mc2_stderr\": 0.014885155226330158\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.797947908445146,\n \"acc_stderr\": 0.011285013754047444\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.66565579984837,\n \"acc_stderr\": 0.012994634003332766\n 
}\n}\n```", "repo_url": "https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|arc:challenge|25_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|gsm8k|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hellaswag|10_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-15-01.452444.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-15-01.452444.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-15-01.452444.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T00-15-01.452444.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-15-01.452444.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T00_15_01.452444", "path": ["**/details_harness|winogrande|5_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T00-15-01.452444.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_30T00_15_01.452444", "path": ["results_2023-12-30T00-15-01.452444.parquet"]}, {"split": "latest", "path": ["results_2023-12-30T00-15-01.452444.parquet"]}]}]} | 2023-12-30T00:17:41+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of beowolx/CodeNinja-1.0-OpenChat-7B
Dataset automatically created during the evaluation run of model beowolx/CodeNinja-1.0-OpenChat-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
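For example, the following minimal sketch loads one of the per-task configurations (here `harness_winogrande_5`, as listed in this dataset's metadata); any other configuration name can be substituted, and the aggregated `results` configuration can be loaded the same way:

```python
from datasets import load_dataset

# Per-sample details for one evaluated task (5-shot Winogrande here);
# the "train" split always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_beowolx__CodeNinja-1.0-OpenChat-7B",
                    "harness_winogrande_5",
                    split="train")

# Aggregated metrics live in the "results" configuration,
# whose "latest" split also points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_beowolx__CodeNinja-1.0-OpenChat-7B",
                       "results",
                       split="latest")
```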
## Latest results
These are the latest results from run 2023-12-30T00:15:01.452444 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of beowolx/CodeNinja-1.0-OpenChat-7B\n\n\n\nDataset automatically created during the evaluation run of model beowolx/CodeNinja-1.0-OpenChat-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T00:15:01.452444(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of beowolx/CodeNinja-1.0-OpenChat-7B\n\n\n\nDataset automatically created during the evaluation run of model beowolx/CodeNinja-1.0-OpenChat-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T00:15:01.452444(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of beowolx/CodeNinja-1.0-OpenChat-7B\n\n\n\nDataset automatically created during the evaluation run of model beowolx/CodeNinja-1.0-OpenChat-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T00:15:01.452444(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
ec1bb02257c53fc131424d276c9a757f0a6cfb0e |
# Dataset Card for Evaluation run of jikaixuan/test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jikaixuan/test](https://huggingface.co/jikaixuan/test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jikaixuan__test",
"harness_winogrande_5",
split="train")
```
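The aggregated "results" configuration mentioned above can be loaded the same way; the following is only a minimal sketch, assuming the "latest" split exposed by this repository is the one you want:

```python
from datasets import load_dataset

# Load the aggregated scores; the "latest" split points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_jikaixuan__test",
	"results",
	split="latest")
```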
## Latest results
These are the [latest results from run 2023-12-30T00:21:29.315161](https://huggingface.co/datasets/open-llm-leaderboard/details_jikaixuan__test/blob/main/results_2023-12-30T00-21-29.315161.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6083156130977855,
"acc_stderr": 0.03325697047846345,
"acc_norm": 0.6149057049790325,
"acc_norm_stderr": 0.03395674600613233,
"mc1": 0.40758873929008566,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.575115982039762,
"mc2_stderr": 0.015744867615337492
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.01435639941800912,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.014163366896192594
},
"harness|hellaswag|10": {
"acc": 0.6490738896634136,
"acc_stderr": 0.004762844770909858,
"acc_norm": 0.8441545508862777,
"acc_norm_stderr": 0.003619674864035017
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.032662042990646775,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.032662042990646775
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.02749350424454805,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.02749350424454805
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.02460362692409742,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.02460362692409742
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.01697028909045803,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.01697028909045803
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.033922384053216174,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.033922384053216174
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588674,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588674
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965835,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965835
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.01492744710193715,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.01492744710193715
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242826,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242826
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33854748603351953,
"acc_stderr": 0.01582670009648135,
"acc_norm": 0.33854748603351953,
"acc_norm_stderr": 0.01582670009648135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508755,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508755
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.02666441088693762,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.02666441088693762
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02622964917882117,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02622964917882117
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4315514993481095,
"acc_stderr": 0.01265000799946388,
"acc_norm": 0.4315514993481095,
"acc_norm_stderr": 0.01265000799946388
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.019659922493623343,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.019659922493623343
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40758873929008566,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.575115982039762,
"mc2_stderr": 0.015744867615337492
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.2721758908263836,
"acc_stderr": 0.012259714035164548
}
}
```
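For a quick look at the headline numbers, a results file like the one above can be read with the standard library. This is only a sketch: it assumes the JSON file has been downloaded locally and keeps the same top-level layout as the excerpt.

```python
import json

# Sketch only: the filename is the one linked above and is assumed to be local.
with open("results_2023-12-30T00-21-29.315161.json") as f:
    results = json.load(f)

# "all" holds the aggregated metrics, as in the excerpt above.
overall = results["all"]
print(f"acc={overall['acc']:.4f}, acc_norm={overall['acc_norm']:.4f}, mc2={overall['mc2']:.4f}")
```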
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jikaixuan__test | [
"region:us"
] | 2023-12-30T00:23:44+00:00 | {"pretty_name": "Evaluation run of jikaixuan/test", "dataset_summary": "Dataset automatically created during the evaluation run of model [jikaixuan/test](https://huggingface.co/jikaixuan/test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jikaixuan__test\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T00:21:29.315161](https://huggingface.co/datasets/open-llm-leaderboard/details_jikaixuan__test/blob/main/results_2023-12-30T00-21-29.315161.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6083156130977855,\n \"acc_stderr\": 0.03325697047846345,\n \"acc_norm\": 0.6149057049790325,\n \"acc_norm_stderr\": 0.03395674600613233,\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.575115982039762,\n \"mc2_stderr\": 0.015744867615337492\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.01435639941800912,\n \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.014163366896192594\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6490738896634136,\n \"acc_stderr\": 0.004762844770909858,\n \"acc_norm\": 0.8441545508862777,\n \"acc_norm_stderr\": 0.003619674864035017\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646775,\n \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646775\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.02749350424454805,\n \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.02749350424454805\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.02460362692409742,\n \"acc_norm\": 
0.6205128205128205,\n \"acc_norm_stderr\": 0.02460362692409742\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045803,\n \"acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045803\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.033922384053216174,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.033922384053216174\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588674,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588674\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965835,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965835\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.6322869955156951,\n \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n \"acc_stderr\": 0.01492744710193715,\n \"acc_norm\": 0.7752234993614304,\n \"acc_norm_stderr\": 0.01492744710193715\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242826,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242826\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n \"acc_stderr\": 0.02666441088693762,\n \"acc_norm\": 0.6720257234726688,\n \"acc_norm_stderr\": 0.02666441088693762\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882117,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882117\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4315514993481095,\n \"acc_stderr\": 0.01265000799946388,\n \"acc_norm\": 0.4315514993481095,\n \"acc_norm_stderr\": 0.01265000799946388\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.02909720956841195,\n \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.02909720956841195\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.019659922493623343,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.019659922493623343\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683906,\n \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683906\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.575115982039762,\n \"mc2_stderr\": 0.015744867615337492\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2721758908263836,\n \"acc_stderr\": 0.012259714035164548\n }\n}\n```", "repo_url": "https://huggingface.co/jikaixuan/test", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|arc:challenge|25_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|gsm8k|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hellaswag|10_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-21-29.315161.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-21-29.315161.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-21-29.315161.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T00-21-29.315161.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-21-29.315161.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["**/details_harness|winogrande|5_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T00-21-29.315161.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T00_21_29.315161", "path": ["results_2023-12-30T00-21-29.315161.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T00-21-29.315161.parquet"]}]}]} | 2023-12-30T00:24:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jikaixuan/test
Dataset automatically created during the evaluation run of model jikaixuan/test on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
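A minimal sketch is shown below; the repository name is inferred from the usual `open-llm-leaderboard/details_<org>__<model>` naming pattern and the config name is just one of the 63 available, so both may need adjusting:

```python
from datasets import load_dataset

# Repo id assumed from the standard open-llm-leaderboard naming convention
# for this model; "harness_winogrande_5" is one example per-task config.
data = load_dataset("open-llm-leaderboard/details_jikaixuan__test",
	"harness_winogrande_5",
	split="train")
```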
## Latest results
These are the latest results from run 2023-12-30T00:21:29.315161 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of jikaixuan/test\n\n\n\nDataset automatically created during the evaluation run of model jikaixuan/test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T00:21:29.315161(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jikaixuan/test\n\n\n\nDataset automatically created during the evaluation run of model jikaixuan/test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T00:21:29.315161(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
171,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jikaixuan/test\n\n\n\nDataset automatically created during the evaluation run of model jikaixuan/test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T00:21:29.315161(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
1fa21335b1db441a85c6e8c0cc96cc90ddd24428 |
# Dataset card for test_dataset_huggingface_ADE20k_format_v1
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset description](#dataset-description)
- [Dataset categories](#dataset-categories)
## Dataset description
- **Homepage:** https://segments.ai/Lit4pCol4b/test_dataset_huggingface_ADE20k_format_v1
This dataset was created using [Segments.ai](https://segments.ai). It can be found [here](https://segments.ai/Lit4pCol4b/test_dataset_huggingface_ADE20k_format_v1).
## Dataset categories
| Id | Name | Description |
| --- | ---- | ----------- |
| 1 | objeto_interes | - |
| 2 | agua | - |
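As a hedged illustration, the release can be pulled through the `datasets` library and the ids above mapped back to their names; the split name below is an assumption, so inspect `ds.features` before relying on specific column names:

```python
from datasets import load_dataset

# Category ids as listed in the table above.
ID2LABEL = {1: "objeto_interes", 2: "agua"}

# Repo id taken from this card; the "train" split name is an assumption.
ds = load_dataset("Lit4pCol4b/test_dataset_huggingface_ADE20k_format_v1", split="train")

print(ds.features)  # inspect the actual image/mask column names before use
print(ID2LABEL)     # map raw mask values back to the category names above
```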
| Lit4pCol4b/test_dataset_huggingface_ADE20k_format_v1 | [
"task_categories:image-segmentation",
"region:us"
] | 2023-12-30T00:54:16+00:00 | {"task_categories": ["image-segmentation"]} | 2023-12-30T13:36:08+00:00 | [] | [] | TAGS
#task_categories-image-segmentation #region-us
| Dataset card for test\_dataset\_huggingface\_ADE20k\_format\_v1
===============================================================
Table of Contents
-----------------
* Table of Contents
* Dataset description
* Dataset categories
Dataset description
-------------------
* Homepage: URL
This dataset was created using URL. It can be found here.
Dataset categories
------------------
Id: 1, Name: objeto\_interes, Description: -
Id: 2, Name: agua, Description: -
| [] | [
"TAGS\n#task_categories-image-segmentation #region-us \n"
] | [
18
] | [
"passage: TAGS\n#task_categories-image-segmentation #region-us \n"
] |
efa5b2d9414d1f7d4455a7c8a712261b79639ed7 |
# Dataset Card for Evaluation run of Undi95/Mistral-11B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Undi95/Mistral-11B-v0.1](https://huggingface.co/Undi95/Mistral-11B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Mistral-11B-v0.1",
"harness_winogrande_5",
split="train")
```
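To see which of the 63 per-task configurations exist before picking one, the standard `datasets` helper below lists them (a sketch; the exact config names are the ones enumerated in this card's metadata):

```python
from datasets import get_dataset_config_names

# Lists every per-task configuration of this details repo, e.g.
# "harness_arc_challenge_25", "harness_gsm8k_5", ..., "results".
configs = get_dataset_config_names("open-llm-leaderboard/details_Undi95__Mistral-11B-v0.1")
print(len(configs), configs[:5])
```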
## Latest results
These are the [latest results from run 2023-12-30T00:55:47.571163](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-v0.1/blob/main/results_2023-12-30T00-55-47.571163.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6300139074610193,
"acc_stderr": 0.03239200090048791,
"acc_norm": 0.6378790325146357,
"acc_norm_stderr": 0.03306276365916844,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299962,
"mc2": 0.4066832234739293,
"mc2_stderr": 0.014223545486867587
},
"harness|arc:challenge|25": {
"acc": 0.5571672354948806,
"acc_stderr": 0.014515573873348902,
"acc_norm": 0.5955631399317406,
"acc_norm_stderr": 0.014342036483436177
},
"harness|hellaswag|10": {
"acc": 0.6128261302529376,
"acc_stderr": 0.004861084534087025,
"acc_norm": 0.8116908982274448,
"acc_norm_stderr": 0.0039015979142464933
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.02494236893115979,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.02494236893115979
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266854,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266854
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.03350991604696043,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.03350991604696043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098825,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001505,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001505
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30502793296089387,
"acc_stderr": 0.015398723510916715,
"acc_norm": 0.30502793296089387,
"acc_norm_stderr": 0.015398723510916715
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4322033898305085,
"acc_stderr": 0.012652297777114968,
"acc_norm": 0.4322033898305085,
"acc_norm_stderr": 0.012652297777114968
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.027778298701545436,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.027778298701545436
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882536,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882536
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299962,
"mc2": 0.4066832234739293,
"mc2_stderr": 0.014223545486867587
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183525
},
"harness|gsm8k|5": {
"acc": 0.266868840030326,
"acc_stderr": 0.012183780551887955
}
}
```
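For illustration, the per-task scores in a payload like the one above can be flattened with plain dictionary handling; the `results` argument below is assumed to be the parsed JSON shown here, not a specific API object:

```python
def per_task_scores(results: dict) -> dict:
    """Return {task_name: score}, picking acc_norm, then acc, then mc2 per task."""
    table = {}
    for task, metrics in results.items():
        if task == "all":  # skip the aggregate entry
            continue
        score = metrics.get("acc_norm", metrics.get("acc", metrics.get("mc2")))
        if score is not None:
            table[task] = score
    return table

# e.g. per_task_scores(parsed)["harness|gsm8k|5"] would give 0.266868840030326
```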
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Undi95__Mistral-11B-v0.1 | [
"region:us"
] | 2023-12-30T00:58:05+00:00 | {"pretty_name": "Evaluation run of Undi95/Mistral-11B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Undi95/Mistral-11B-v0.1](https://huggingface.co/Undi95/Mistral-11B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Mistral-11B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T00:55:47.571163](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-v0.1/blob/main/results_2023-12-30T00-55-47.571163.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6300139074610193,\n \"acc_stderr\": 0.03239200090048791,\n \"acc_norm\": 0.6378790325146357,\n \"acc_norm_stderr\": 0.03306276365916844,\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299962,\n \"mc2\": 0.4066832234739293,\n \"mc2_stderr\": 0.014223545486867587\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5571672354948806,\n \"acc_stderr\": 0.014515573873348902,\n \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.014342036483436177\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6128261302529376,\n \"acc_stderr\": 0.004861084534087025,\n \"acc_norm\": 0.8116908982274448,\n \"acc_norm_stderr\": 0.0039015979142464933\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n 
\"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115979,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115979\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.818348623853211,\n \"acc_stderr\": 0.016530617409266854,\n \"acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266854\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.03350991604696043,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.03350991604696043\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098825,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098825\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n \"acc_stderr\": 0.013927751372001505,\n \"acc_norm\": 
0.8135376756066411,\n \"acc_norm_stderr\": 0.013927751372001505\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30502793296089387,\n \"acc_stderr\": 0.015398723510916715,\n \"acc_norm\": 0.30502793296089387,\n \"acc_norm_stderr\": 0.015398723510916715\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4322033898305085,\n \"acc_stderr\": 0.012652297777114968,\n \"acc_norm\": 0.4322033898305085,\n \"acc_norm_stderr\": 0.012652297777114968\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.027778298701545436,\n \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.027778298701545436\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882536,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882536\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299962,\n \"mc2\": 0.4066832234739293,\n \"mc2_stderr\": 0.014223545486867587\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183525\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.266868840030326,\n \"acc_stderr\": 0.012183780551887955\n }\n}\n```", "repo_url": "https://huggingface.co/Undi95/Mistral-11B-v0.1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|arc:challenge|25_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|gsm8k|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hellaswag|10_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-55-47.571163.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-55-47.571163.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-55-47.571163.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T00-55-47.571163.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-55-47.571163.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-55-47.571163.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["**/details_harness|winogrande|5_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T00-55-47.571163.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T00_55_47.571163", "path": ["results_2023-12-30T00-55-47.571163.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T00-55-47.571163.parquet"]}]}]} | 2023-12-30T00:58:26+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Undi95/Mistral-11B-v0.1
Dataset automatically created during the evaluation run of model Undi95/Mistral-11B-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-30T00:55:47.571163 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Undi95/Mistral-11B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model Undi95/Mistral-11B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T00:55:47.571163(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Undi95/Mistral-11B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model Undi95/Mistral-11B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T00:55:47.571163(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
183,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Undi95/Mistral-11B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model Undi95/Mistral-11B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T00:55:47.571163(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
2c28a6dacf82fa92b4b2512e1cf639c26c346366 |
# Dataset Card for Evaluation run of Walmart-the-bag/Influxient-4x13B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Walmart-the-bag/Influxient-4x13B](https://huggingface.co/Walmart-the-bag/Influxient-4x13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Walmart-the-bag__Influxient-4x13B",
"harness_winogrande_5",
split="train")
```
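The aggregated metrics can be pulled in the same way by pointing at the "results" configuration. The snippet below is a minimal sketch assuming the repository name above and the "latest" split, which always tracks the most recent run:

```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration for this model.
# The "latest" split always points to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_Walmart-the-bag__Influxient-4x13B",
	"results",
	split="latest")
```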
## Latest results
These are the [latest results from run 2023-12-30T01:10:07.093239](https://huggingface.co/datasets/open-llm-leaderboard/details_Walmart-the-bag__Influxient-4x13B/blob/main/results_2023-12-30T01-10-07.093239.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5727072313721517,
"acc_stderr": 0.033466156465793005,
"acc_norm": 0.5776499509226207,
"acc_norm_stderr": 0.03415178949023358,
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405334,
"mc2": 0.5410446803363212,
"mc2_stderr": 0.0155300726933085
},
"harness|arc:challenge|25": {
"acc": 0.5784982935153583,
"acc_stderr": 0.014430197069326023,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.01423587248790987
},
"harness|hellaswag|10": {
"acc": 0.6480780720971918,
"acc_stderr": 0.004765937515197188,
"acc_norm": 0.834196375224059,
"acc_norm_stderr": 0.0037114419828661784
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.03784271932887467,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.03784271932887467
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.024373197867983067,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.024373197867983067
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.026985289576552746,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.026985289576552746
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5487179487179488,
"acc_stderr": 0.025230381238934833,
"acc_norm": 0.5487179487179488,
"acc_norm_stderr": 0.025230381238934833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7724770642201835,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.7724770642201835,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069422,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069422
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794089,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794089
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489284,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489284
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946005,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946005
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6213872832369942,
"acc_stderr": 0.026113749361310345,
"acc_norm": 0.6213872832369942,
"acc_norm_stderr": 0.026113749361310345
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971642,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971642
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.026406145973625676,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.026406145973625676
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4256844850065189,
"acc_stderr": 0.012628393551811945,
"acc_norm": 0.4256844850065189,
"acc_norm_stderr": 0.012628393551811945
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5898692810457516,
"acc_stderr": 0.019898412717635906,
"acc_norm": 0.5898692810457516,
"acc_norm_stderr": 0.019898412717635906
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505418,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505418
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772432,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772432
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405334,
"mc2": 0.5410446803363212,
"mc2_stderr": 0.0155300726933085
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759987
},
"harness|gsm8k|5": {
"acc": 0.3305534495830174,
"acc_stderr": 0.012957496367085026
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Walmart-the-bag__Influxient-4x13B | [
"region:us"
] | 2023-12-30T01:12:31+00:00 | {"pretty_name": "Evaluation run of Walmart-the-bag/Influxient-4x13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Walmart-the-bag/Influxient-4x13B](https://huggingface.co/Walmart-the-bag/Influxient-4x13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Walmart-the-bag__Influxient-4x13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T01:10:07.093239](https://huggingface.co/datasets/open-llm-leaderboard/details_Walmart-the-bag__Influxient-4x13B/blob/main/results_2023-12-30T01-10-07.093239.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5727072313721517,\n \"acc_stderr\": 0.033466156465793005,\n \"acc_norm\": 0.5776499509226207,\n \"acc_norm_stderr\": 0.03415178949023358,\n \"mc1\": 0.37454100367197063,\n \"mc1_stderr\": 0.016943535128405334,\n \"mc2\": 0.5410446803363212,\n \"mc2_stderr\": 0.0155300726933085\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5784982935153583,\n \"acc_stderr\": 0.014430197069326023,\n \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.01423587248790987\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6480780720971918,\n \"acc_stderr\": 0.004765937515197188,\n \"acc_norm\": 0.834196375224059,\n \"acc_norm_stderr\": 0.0037114419828661784\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981749,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981749\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 
0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n \"acc_stderr\": 0.03784271932887467,\n \"acc_norm\": 0.5606936416184971,\n \"acc_norm_stderr\": 0.03784271932887467\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3386243386243386,\n \"acc_stderr\": 0.024373197867983067,\n \"acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.024373197867983067\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n \"acc_stderr\": 0.026985289576552746,\n \"acc_norm\": 0.6580645161290323,\n \"acc_norm_stderr\": 0.026985289576552746\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.5487179487179488,\n \"acc_stderr\": 0.025230381238934833,\n \"acc_norm\": 0.5487179487179488,\n \"acc_norm_stderr\": 0.025230381238934833\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069422,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069422\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794089,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794089\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.023636873317489284,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.023636873317489284\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n \"acc_stderr\": 0.014987270640946005,\n \"acc_norm\": 
0.7726692209450831,\n \"acc_norm_stderr\": 0.014987270640946005\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6213872832369942,\n \"acc_stderr\": 0.026113749361310345,\n \"acc_norm\": 0.6213872832369942,\n \"acc_norm_stderr\": 0.026113749361310345\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n \"acc_stderr\": 0.027368078243971642,\n \"acc_norm\": 0.6334405144694534,\n \"acc_norm_stderr\": 0.027368078243971642\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.026406145973625676,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.026406145973625676\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4256844850065189,\n \"acc_stderr\": 0.012628393551811945,\n \"acc_norm\": 0.4256844850065189,\n \"acc_norm_stderr\": 0.012628393551811945\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5898692810457516,\n \"acc_stderr\": 0.019898412717635906,\n \"acc_norm\": 0.5898692810457516,\n \"acc_norm_stderr\": 0.019898412717635906\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.04582004841505418,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.04582004841505418\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.7711442786069652,\n \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37454100367197063,\n \"mc1_stderr\": 0.016943535128405334,\n \"mc2\": 0.5410446803363212,\n \"mc2_stderr\": 0.0155300726933085\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759987\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3305534495830174,\n \"acc_stderr\": 0.012957496367085026\n }\n}\n```", "repo_url": 
"https://huggingface.co/Walmart-the-bag/Influxient-4x13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|arc:challenge|25_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|gsm8k|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hellaswag|10_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-10-07.093239.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-10-07.093239.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-10-07.093239.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T01-10-07.093239.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-10-07.093239.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-10-07.093239.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["**/details_harness|winogrande|5_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T01-10-07.093239.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T01_10_07.093239", "path": ["results_2023-12-30T01-10-07.093239.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T01-10-07.093239.parquet"]}]}]} | 2023-12-30T01:12:54+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Walmart-the-bag/Influxient-4x13B
Dataset automatically created during the evaluation run of model Walmart-the-bag/Influxient-4x13B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-30T01:10:07.093239 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Walmart-the-bag/Influxient-4x13B\n\n\n\nDataset automatically created during the evaluation run of model Walmart-the-bag/Influxient-4x13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T01:10:07.093239(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Walmart-the-bag/Influxient-4x13B\n\n\n\nDataset automatically created during the evaluation run of model Walmart-the-bag/Influxient-4x13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T01:10:07.093239(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Walmart-the-bag/Influxient-4x13B\n\n\n\nDataset automatically created during the evaluation run of model Walmart-the-bag/Influxient-4x13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T01:10:07.093239(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
e321d0301650c3dbf4effcef42ff68f11ad81609 |
# Dataset Card for Evaluation run of namirocks/student-model-13b-ep3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [namirocks/student-model-13b-ep3](https://huggingface.co/namirocks/student-model-13b-ep3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_namirocks__student-model-13b-ep3",
"harness_winogrande_5",
split="train")
```
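If you are not sure which configuration names exist, the `datasets` library can enumerate them. The snippet below is a small sketch using the public `get_dataset_config_names` helper; the expectation of one `harness_*` entry per evaluated task plus a `results` entry is an assumption based on the pattern shown elsewhere in this card.

```python
from datasets import get_dataset_config_names

repo_id = "open-llm-leaderboard/details_namirocks__student-model-13b-ep3"

# One configuration per evaluated task (e.g. "harness_winogrande_5"),
# plus the aggregated "results" configuration described above.
configs = get_dataset_config_names(repo_id)
print(len(configs), "configurations found")
print(configs[:5])
```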
## Latest results
These are the [latest results from run 2023-12-30T01:37:20.077989](https://huggingface.co/datasets/open-llm-leaderboard/details_namirocks__student-model-13b-ep3/blob/main/results_2023-12-30T01-37-20.077989.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5621545466705268,
"acc_stderr": 0.03351539520737431,
"acc_norm": 0.5727655950528345,
"acc_norm_stderr": 0.03442395762278095,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871105,
"mc2": 0.35003126952306707,
"mc2_stderr": 0.014347219852780793
},
"harness|arc:challenge|25": {
"acc": 0.43856655290102387,
"acc_stderr": 0.014500682618212865,
"acc_norm": 0.46501706484641636,
"acc_norm_stderr": 0.01457558392201966
},
"harness|hellaswag|10": {
"acc": 0.6061541525592511,
"acc_stderr": 0.004876028037941937,
"acc_norm": 0.8036247759410476,
"acc_norm_stderr": 0.003964437012249992
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.043036840335373146,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.043036840335373146
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819067,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819067
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.02716253782694846,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.02716253782694846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.035679697722680495,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.035679697722680495
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.03289477330098617,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.03289477330098617
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.028408953626245265,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.028408953626245265
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.02498535492310233,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.02498535492310233
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413926,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7559633027522936,
"acc_stderr": 0.0184152863514164,
"acc_norm": 0.7559633027522936,
"acc_norm_stderr": 0.0184152863514164
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.02798569938703642,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.02798569938703642
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543678,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543678
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7535121328224776,
"acc_stderr": 0.015411308769686936,
"acc_norm": 0.7535121328224776,
"acc_norm_stderr": 0.015411308769686936
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.026296227915613663,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.026296227915613663
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33519553072625696,
"acc_stderr": 0.01578800719018588,
"acc_norm": 0.33519553072625696,
"acc_norm_stderr": 0.01578800719018588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159617,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159617
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971635,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971635
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.026869490744815247,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.026869490744815247
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.02955545423677886,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.02955545423677886
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.423728813559322,
"acc_stderr": 0.01262078515588599,
"acc_norm": 0.423728813559322,
"acc_norm_stderr": 0.01262078515588599
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.03035230339535196,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.03035230339535196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.019955975145835542,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.019955975145835542
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.03055531675557364,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.03055531675557364
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772436,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772436
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871105,
"mc2": 0.35003126952306707,
"mc2_stderr": 0.014347219852780793
},
"harness|winogrande|5": {
"acc": 0.7221783741120757,
"acc_stderr": 0.012588918183871596
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
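To work with the aggregate numbers above programmatically, one option is to read the "results" configuration mentioned earlier rather than copying values out of this card. The sketch below assumes the `latest` split documented above; since the exact schema of the aggregated parquet is not described here, the code inspects the columns instead of hard-coding them.

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split always points
# to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_namirocks__student-model-13b-ep3",
    "results",
    split="latest",
)

# The schema is not documented in this card, so inspect it before relying on it.
print(results.column_names)
print(results[0])
```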
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_namirocks__student-model-13b-ep3 | [
"region:us"
] | 2023-12-30T01:39:40+00:00 | {"pretty_name": "Evaluation run of namirocks/student-model-13b-ep3", "dataset_summary": "Dataset automatically created during the evaluation run of model [namirocks/student-model-13b-ep3](https://huggingface.co/namirocks/student-model-13b-ep3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_namirocks__student-model-13b-ep3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T01:37:20.077989](https://huggingface.co/datasets/open-llm-leaderboard/details_namirocks__student-model-13b-ep3/blob/main/results_2023-12-30T01-37-20.077989.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5621545466705268,\n \"acc_stderr\": 0.03351539520737431,\n \"acc_norm\": 0.5727655950528345,\n \"acc_norm_stderr\": 0.03442395762278095,\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871105,\n \"mc2\": 0.35003126952306707,\n \"mc2_stderr\": 0.014347219852780793\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.43856655290102387,\n \"acc_stderr\": 0.014500682618212865,\n \"acc_norm\": 0.46501706484641636,\n \"acc_norm_stderr\": 0.01457558392201966\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6061541525592511,\n \"acc_stderr\": 0.004876028037941937,\n \"acc_norm\": 0.8036247759410476,\n \"acc_norm_stderr\": 0.003964437012249992\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n 
\"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.043036840335373146,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.043036840335373146\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819067,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819067\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n \"acc_stderr\": 0.02716253782694846,\n \"acc_norm\": 0.6483870967741936,\n \"acc_norm_stderr\": 0.02716253782694846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562427,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562427\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.035679697722680495,\n \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.035679697722680495\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6919191919191919,\n \"acc_stderr\": 0.03289477330098617,\n \"acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.03289477330098617\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245265,\n \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.028408953626245265\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.02498535492310233,\n \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.02498535492310233\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7559633027522936,\n \"acc_stderr\": 0.0184152863514164,\n \"acc_norm\": 0.7559633027522936,\n \"acc_norm_stderr\": 0.0184152863514164\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501947,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501947\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.02798569938703642,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.02798569938703642\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.024414947304543678,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.024414947304543678\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7535121328224776,\n \"acc_stderr\": 0.015411308769686936,\n \"acc_norm\": 0.7535121328224776,\n \"acc_norm_stderr\": 0.015411308769686936\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.026296227915613663,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.026296227915613663\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33519553072625696,\n \"acc_stderr\": 0.01578800719018588,\n \"acc_norm\": 0.33519553072625696,\n \"acc_norm_stderr\": 0.01578800719018588\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159617,\n \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159617\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n \"acc_stderr\": 0.027368078243971635,\n \"acc_norm\": 0.6334405144694534,\n \"acc_norm_stderr\": 0.027368078243971635\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.026869490744815247,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.026869490744815247\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4326241134751773,\n \"acc_stderr\": 0.02955545423677886,\n \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.02955545423677886\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.423728813559322,\n \"acc_stderr\": 0.01262078515588599,\n \"acc_norm\": 0.423728813559322,\n \"acc_norm_stderr\": 0.01262078515588599\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.03035230339535196,\n \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.03035230339535196\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.019955975145835542,\n \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.019955975145835542\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.03055531675557364,\n \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.03055531675557364\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n \"acc_stderr\": 0.029705284056772436,\n \"acc_norm\": 0.7711442786069652,\n \"acc_norm_stderr\": 0.029705284056772436\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871105,\n \"mc2\": 0.35003126952306707,\n \"mc2_stderr\": 0.014347219852780793\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7221783741120757,\n \"acc_stderr\": 0.012588918183871596\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/namirocks/student-model-13b-ep3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|arc:challenge|25_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|gsm8k|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hellaswag|10_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-37-20.077989.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-37-20.077989.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-37-20.077989.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T01-37-20.077989.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-37-20.077989.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-37-20.077989.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["**/details_harness|winogrande|5_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T01-37-20.077989.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T01_37_20.077989", "path": ["results_2023-12-30T01-37-20.077989.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T01-37-20.077989.parquet"]}]}]} | 2023-12-30T01:40:05+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of namirocks/student-model-13b-ep3
Dataset automatically created during the evaluation run of model namirocks/student-model-13b-ep3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
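For example (this is the loading snippet given in the dataset summary above; "harness_winogrande_5" is just one of the 63 available configurations):
```python
from datasets import load_dataset

# The "train" split always points to the latest results for this configuration.
data = load_dataset("open-llm-leaderboard/details_namirocks__student-model-13b-ep3",
                    "harness_winogrande_5",
                    split="train")
```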
## Latest results
These are the latest results from run 2023-12-30T01:37:20.077989 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
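Only the aggregate "all" block is reproduced here, excerpted from the full results JSON embedded in this repository's metadata; per-task scores are available in the corresponding per-task configurations.
```python
{
    "all": {
        "acc": 0.5621545466705268,
        "acc_stderr": 0.03351539520737431,
        "acc_norm": 0.5727655950528345,
        "acc_norm_stderr": 0.03442395762278095,
        "mc1": 0.23623011015911874,
        "mc1_stderr": 0.014869755015871105,
        "mc2": 0.35003126952306707,
        "mc2_stderr": 0.014347219852780793
    }
}
```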
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of namirocks/student-model-13b-ep3\n\n\n\nDataset automatically created during the evaluation run of model namirocks/student-model-13b-ep3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T01:37:20.077989(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of namirocks/student-model-13b-ep3\n\n\n\nDataset automatically created during the evaluation run of model namirocks/student-model-13b-ep3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T01:37:20.077989(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of namirocks/student-model-13b-ep3\n\n\n\nDataset automatically created during the evaluation run of model namirocks/student-model-13b-ep3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T01:37:20.077989(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
cfae20ee9965b6b18dbffc7b622d28d16056735f |
# Introduction
The "COSER-ASR" Subset is a specialized extract from the "Corpus Oral y Sonoro del Español Rural" (COSER; Fernández-Ordóñez 2005-present), meaning the "Audible Corpus of Spoken Rural Spanish". This dataset has been specifically curated to facilitate the fine-tuning of Whisper, an automatic speech recognition system. For this purpose, audio and text segments ranging from 3 to 30 seconds have been automatically extracted from the COSER corpus. These segments provide concise and diverse samples of spoken rural Spanish, ideal for training and refining speech recognition models. To ensure manageability and efficient processing, a maximum of 1024 tokens were used in the dataset, striking a balance between comprehensive coverage and computational efficiency.
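A minimal loading sketch for such a fine-tuning setup is shown below; the repository id ("johnatanebonilla/coser"), the train/validation/test splits and the column names are taken from the dataset metadata further down, while the 16 kHz resampling and the audio decoding backend (e.g. soundfile) are assumptions tied to Whisper's expected input format:
```python
from datasets import load_dataset, Audio

# Load the predefined train / validation / test splits of the COSER-ASR subset.
coser = load_dataset("johnatanebonilla/coser")

# Whisper consumes 16 kHz audio, so the audio column can be resampled on the fly.
coser = coser.cast_column("audio", Audio(sampling_rate=16_000))

example = coser["train"][0]
print(example["sentence"])              # original transcription of this segment
print(example["audio"]["array"].shape)  # decoded waveform (needs an audio backend such as soundfile)
```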
# Content and Demographic Focus
The original COSER dataset includes 218 transcriptions of semi-structured interviews primarily with elderly, less-educated individuals from rural Spain. These interviews, each averaging around 54 minutes, are rich in dialectal variations and linguistic nuances, offering valuable insights into traditional Spanish dialects.
# Transcription Approach
The "coser" dataset provides multiple layers of transcription to cater to different linguistic and computational needs:
### Original Transcription (sentence):
This is the direct transcription of the audio segments, preserving the original speech as closely as possible; it reproduces the complete original transcription.
### Phonological Approximation (sentence_fono):
Here, the transcription is modified to reflect the phonological characteristics of the dialectal pronunciation. This version is crucial for understanding the phonetic nuances of rural Spanish dialects.
### Phonological Transcription without Discourse Markers (sentence_fono_sin_marcas):
This transcription removes discourse markers such as laughter, assent, etc., that are typically enclosed in square brackets. It offers a cleaner version focusing solely on the spoken words.
### Orthographic Correspondence (sentence_orto):
This layer provides the standard orthographic equivalent of the words transcribed phonologically. It bridges the gap between dialectal speech and standard Spanish orthography.
### Orthographic Transcription without Discourse Markers (sentence_orto_sin_marcas):
Similar to the phonological version without markers, this transcription provides a standard orthographic text devoid of any discourse markers. This is particularly useful for applications requiring clean text data.
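The layer names above correspond one-to-one to dataset columns, so the different transcriptions of a single segment can be compared directly; a short illustrative snippet (streaming is used here only to avoid downloading the full ~4.7 GB of audio for a quick inspection):
```python
from datasets import load_dataset

# Stream the train split instead of downloading all audio files.
train = load_dataset("johnatanebonilla/coser", split="train", streaming=True)
example = next(iter(train))

# Print every transcription layer for the same audio segment.
for column in (
    "sentence",                  # original transcription
    "sentence_fono",             # phonological approximation
    "sentence_fono_sin_marcas",  # phonological, without discourse markers
    "sentence_orto",             # orthographic correspondence
    "sentence_orto_sin_marcas",  # orthographic, without discourse markers
):
    print(f"{column}: {example[column]}")
```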
# Limitations
Limitations of this dataset include the fact that the time intervals in the COSER corpus are not systematically aligned, meaning that there may not be a perfect one-to-one correspondence between the audio and text data.
# Additional Information and Resources
To explore more about the COSER corpus, its methodologies, and the full range of transcriptions, visit http://coser.lllf.uam.es/ and http://coser.lllf.uam.es/transcripcion.php. These resources provide an in-depth look at the COSER project, detailing its comprehensive approach to capturing the linguistic diversity of rural Spanish.
# References
Fernández-Ordóñez, I. (Ed.). (2005-present). Corpus Oral y Sonoro del Español Rural. Retrieved April 15, 2022, from http://www.corpusrural.es/ | johnatanebonilla/coser | [
"task_categories:automatic-speech-recognition",
"task_categories:conversational",
"size_categories:10K<n<100K",
"language:es",
"doi:10.57967/hf/1564",
"region:us"
] | 2023-12-30T01:48:59+00:00 | {"language": ["es"], "size_categories": ["10K<n<100K"], "task_categories": ["automatic-speech-recognition", "conversational"], "pretty_name": "COSER-ASR Subset", "dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "filename", "dtype": "string"}, {"name": "turno_id", "dtype": "int64"}, {"name": "turno_time", "dtype": "string"}, {"name": "sentence", "dtype": "string"}, {"name": "sentence_fono", "dtype": "string"}, {"name": "sentence_fono_sin_marcas", "dtype": "string"}, {"name": "sentence_orto", "dtype": "string"}, {"name": "sentence_orto_sin_marcas", "dtype": "string"}, {"name": "Provincia", "dtype": "string"}, {"name": "Enclave", "dtype": "string"}, {"name": "Fecha", "dtype": "string"}, {"name": "Duraci\u00f3n", "dtype": "string"}, {"name": "Informantes", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4600923777.433, "num_examples": 53971}, {"name": "validation", "num_bytes": 503026194.46, "num_examples": 6689}, {"name": "test", "num_bytes": 486076659.954, "num_examples": 6726}], "download_size": 4707509912, "dataset_size": 5590026631.847}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-03T12:15:06+00:00 | [] | [
"es"
] | TAGS
#task_categories-automatic-speech-recognition #task_categories-conversational #size_categories-10K<n<100K #language-Spanish #doi-10.57967/hf/1564 #region-us
|
# Introduction
The "COSER-ASR" Subset is a specialized extract from the "Corpus Oral y Sonoro del Español Rural" (COSER; Fernández-Ordóñez 2005-present), meaning the "Audible Corpus of Spoken Rural Spanish". This dataset has been specifically curated to facilitate the fine-tuning of Whisper, an automatic speech recognition system. For this purpose, audio and text segments ranging from 3 to 30 seconds have been automatically extracted from the COSER corpus. These segments provide concise and diverse samples of spoken rural Spanish, ideal for training and refining speech recognition models. To ensure manageability and efficient processing, a maximum of 1024 tokens were used in the dataset, striking a balance between comprehensive coverage and computational efficiency.
# Content and Demographic Focus
The original COSER dataset includes 218 transcriptions of semi-structured interviews primarily with elderly, less-educated individuals from rural Spain. These interviews, each averaging around 54 minutes, are rich in dialectal variations and linguistic nuances, offering valuable insights into traditional Spanish dialects.
# Transcription Approach
The "coser" dataset provides multiple layers of transcription to cater to different linguistic and computational needs:
### Original Transcription (sentence):
This is the direct transcription of the audio segments, preserving the original speech as closely as possible and the complete original transcription.
### Phonological Approximation (sentence_fono):
Here, the transcription is modified to reflect the phonological characteristics of the dialectal pronunciation. This version is crucial for understanding the phonetic nuances of rural Spanish dialects.
### Phonological Transcription without Discourse Markers (sentence_fono_sin_marcas):
This transcription removes discourse markers such as laughter, assent, etc., that are typically enclosed in square brackets. It offers a cleaner version focusing solely on the spoken words.
### Orthographic Correspondence (sentence_orto):
This layer provides the standard orthographic equivalent of the words transcribed phonologically. It bridges the gap between dialectal speech and standard Spanish orthography.
### Orthographic Transcription without Discourse Markers (sentence_orto_sin_marcas):
Similar to the phonological version without markers, this transcription provides a standard orthographic text devoid of any discourse markers. This is particularly useful for applications requiring clean text data.
# Limitations
Limitations of this model include the fact that the time intervals in the COSER corpus are not systematically aligned, meaning that there may not be a perfect one-to-one correspondence between the audio and text data.
# Additional Information and Resources
To explore more about the COSER corpus, its methodologies, and the full range of transcriptions, visit URL and URL These resources provide an in-depth look at the COSER project, detailing its comprehensive approach to capturing the linguistic diversity of rural Spanish.
# References
Fernández-Ordóñez, I. (Ed.). (2005-present). Corpus Oral y Sonoro del Español Rural. Retrieved April 15, 2022, from URL | [
"# Introduction\nThe \"COSER-ASR\" Subset is a specialized extract from the \"Corpus Oral y Sonoro del Español Rural\" (COSER; Fernández-Ordóñez 2005-present), meaning the \"Audible Corpus of Spoken Rural Spanish\". This dataset has been specifically curated to facilitate the fine-tuning of Whisper, an automatic speech recognition system. For this purpose, audio and text segments ranging from 3 to 30 seconds have been automatically extracted from the COSER corpus. These segments provide concise and diverse samples of spoken rural Spanish, ideal for training and refining speech recognition models. To ensure manageability and efficient processing, a maximum of 1024 tokens were used in the dataset, striking a balance between comprehensive coverage and computational efficiency.",
"# Content and Demographic Focus\nThe original COSER dataset includes 218 transcriptions of semi-structured interviews primarily with elderly, less-educated individuals from rural Spain. These interviews, each averaging around 54 minutes, are rich in dialectal variations and linguistic nuances, offering valuable insights into traditional Spanish dialects.",
"# Transcription Approach\nThe \"coser\" dataset provides multiple layers of transcription to cater to different linguistic and computational needs:",
"### Original Transcription (sentence): \nThis is the direct transcription of the audio segments, preserving the original speech as closely as possible and the complete original transcription.",
"### Phonological Approximation (sentence_fono): \nHere, the transcription is modified to reflect the phonological characteristics of the dialectal pronunciation. This version is crucial for understanding the phonetic nuances of rural Spanish dialects.",
"### Phonological Transcription without Discourse Markers (sentence_fono_sin_marcas): \nThis transcription removes discourse markers such as laughter, assent, etc., that are typically enclosed in square brackets. It offers a cleaner version focusing solely on the spoken words.",
"### Orthographic Correspondence (sentence_orto): \nThis layer provides the standard orthographic equivalent of the words transcribed phonologically. It bridges the gap between dialectal speech and standard Spanish orthography.",
"### Orthographic Transcription without Discourse Markers (sentence_orto_sin_marcas): \nSimilar to the phonological version without markers, this transcription provides a standard orthographic text devoid of any discourse markers. This is particularly useful for applications requiring clean text data.",
"# Limitations\n\nLimitations of this model include the fact that the time intervals in the COSER corpus are not systematically aligned, meaning that there may not be a perfect one-to-one correspondence between the audio and text data.",
"# Additional Information and Resources\nTo explore more about the COSER corpus, its methodologies, and the full range of transcriptions, visit URL and URL These resources provide an in-depth look at the COSER project, detailing its comprehensive approach to capturing the linguistic diversity of rural Spanish.",
"# References\n\nFernández-Ordóñez, I. (Ed.). (2005-present). Corpus Oral y Sonoro del Español Rural. Retrieved April 15, 2022, from URL"
] | [
"TAGS\n#task_categories-automatic-speech-recognition #task_categories-conversational #size_categories-10K<n<100K #language-Spanish #doi-10.57967/hf/1564 #region-us \n",
"# Introduction\nThe \"COSER-ASR\" Subset is a specialized extract from the \"Corpus Oral y Sonoro del Español Rural\" (COSER; Fernández-Ordóñez 2005-present), meaning the \"Audible Corpus of Spoken Rural Spanish\". This dataset has been specifically curated to facilitate the fine-tuning of Whisper, an automatic speech recognition system. For this purpose, audio and text segments ranging from 3 to 30 seconds have been automatically extracted from the COSER corpus. These segments provide concise and diverse samples of spoken rural Spanish, ideal for training and refining speech recognition models. To ensure manageability and efficient processing, a maximum of 1024 tokens were used in the dataset, striking a balance between comprehensive coverage and computational efficiency.",
"# Content and Demographic Focus\nThe original COSER dataset includes 218 transcriptions of semi-structured interviews primarily with elderly, less-educated individuals from rural Spain. These interviews, each averaging around 54 minutes, are rich in dialectal variations and linguistic nuances, offering valuable insights into traditional Spanish dialects.",
"# Transcription Approach\nThe \"coser\" dataset provides multiple layers of transcription to cater to different linguistic and computational needs:",
"### Original Transcription (sentence): \nThis is the direct transcription of the audio segments, preserving the original speech as closely as possible and the complete original transcription.",
"### Phonological Approximation (sentence_fono): \nHere, the transcription is modified to reflect the phonological characteristics of the dialectal pronunciation. This version is crucial for understanding the phonetic nuances of rural Spanish dialects.",
"### Phonological Transcription without Discourse Markers (sentence_fono_sin_marcas): \nThis transcription removes discourse markers such as laughter, assent, etc., that are typically enclosed in square brackets. It offers a cleaner version focusing solely on the spoken words.",
"### Orthographic Correspondence (sentence_orto): \nThis layer provides the standard orthographic equivalent of the words transcribed phonologically. It bridges the gap between dialectal speech and standard Spanish orthography.",
"### Orthographic Transcription without Discourse Markers (sentence_orto_sin_marcas): \nSimilar to the phonological version without markers, this transcription provides a standard orthographic text devoid of any discourse markers. This is particularly useful for applications requiring clean text data.",
"# Limitations\n\nLimitations of this model include the fact that the time intervals in the COSER corpus are not systematically aligned, meaning that there may not be a perfect one-to-one correspondence between the audio and text data.",
"# Additional Information and Resources\nTo explore more about the COSER corpus, its methodologies, and the full range of transcriptions, visit URL and URL These resources provide an in-depth look at the COSER project, detailing its comprehensive approach to capturing the linguistic diversity of rural Spanish.",
"# References\n\nFernández-Ordóñez, I. (Ed.). (2005-present). Corpus Oral y Sonoro del Español Rural. Retrieved April 15, 2022, from URL"
] | [
61,
171,
74,
32,
39,
57,
74,
52,
66,
50,
66,
38
] | [
"passage: TAGS\n#task_categories-automatic-speech-recognition #task_categories-conversational #size_categories-10K<n<100K #language-Spanish #doi-10.57967/hf/1564 #region-us \n# Introduction\nThe \"COSER-ASR\" Subset is a specialized extract from the \"Corpus Oral y Sonoro del Español Rural\" (COSER; Fernández-Ordóñez 2005-present), meaning the \"Audible Corpus of Spoken Rural Spanish\". This dataset has been specifically curated to facilitate the fine-tuning of Whisper, an automatic speech recognition system. For this purpose, audio and text segments ranging from 3 to 30 seconds have been automatically extracted from the COSER corpus. These segments provide concise and diverse samples of spoken rural Spanish, ideal for training and refining speech recognition models. To ensure manageability and efficient processing, a maximum of 1024 tokens were used in the dataset, striking a balance between comprehensive coverage and computational efficiency.# Content and Demographic Focus\nThe original COSER dataset includes 218 transcriptions of semi-structured interviews primarily with elderly, less-educated individuals from rural Spain. These interviews, each averaging around 54 minutes, are rich in dialectal variations and linguistic nuances, offering valuable insights into traditional Spanish dialects.# Transcription Approach\nThe \"coser\" dataset provides multiple layers of transcription to cater to different linguistic and computational needs:### Original Transcription (sentence): \nThis is the direct transcription of the audio segments, preserving the original speech as closely as possible and the complete original transcription.### Phonological Approximation (sentence_fono): \nHere, the transcription is modified to reflect the phonological characteristics of the dialectal pronunciation. This version is crucial for understanding the phonetic nuances of rural Spanish dialects."
] |
2b19584aed511e1eee46009af1645f4dafc6fb92 |
# Dataset Card for Evaluation run of cookinai/Valkyrie-V1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cookinai/Valkyrie-V1](https://huggingface.co/cookinai/Valkyrie-V1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cookinai__Valkyrie-V1",
"harness_winogrande_5",
split="train")
```
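
Once loaded, the split behaves like any other `datasets.Dataset`; a quick, illustrative way to inspect it (the exact fields depend on the chosen configuration):

```python
# Quick inspection of the loaded split; available columns vary per task config.
print(data.column_names)  # fields logged for this task
print(data.num_rows)      # number of evaluated examples
print(data[0])            # first logged prediction record
```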
## Latest results
These are the [latest results from run 2023-12-30T01:47:44.529277](https://huggingface.co/datasets/open-llm-leaderboard/details_cookinai__Valkyrie-V1/blob/main/results_2023-12-30T01-47-44.529277.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6522868637521901,
"acc_stderr": 0.032133209567569515,
"acc_norm": 0.6522561915341794,
"acc_norm_stderr": 0.03280005450279144,
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.603958710710944,
"mc2_stderr": 0.01501017049153533
},
"harness|arc:challenge|25": {
"acc": 0.6424914675767918,
"acc_stderr": 0.014005494275916573,
"acc_norm": 0.6723549488054608,
"acc_norm_stderr": 0.013715847940719337
},
"harness|hellaswag|10": {
"acc": 0.6709818761202948,
"acc_stderr": 0.004688963175758131,
"acc_norm": 0.8626767576180043,
"acc_norm_stderr": 0.003434848525388187
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6897435897435897,
"acc_stderr": 0.02345467488940429,
"acc_norm": 0.6897435897435897,
"acc_norm_stderr": 0.02345467488940429
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092434,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323792,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323792
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508283,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4223463687150838,
"acc_stderr": 0.016519594275297117,
"acc_norm": 0.4223463687150838,
"acc_norm_stderr": 0.016519594275297117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083133,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083133
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.603958710710944,
"mc2_stderr": 0.01501017049153533
},
"harness|winogrande|5": {
"acc": 0.8145224940805051,
"acc_stderr": 0.010923965303140505
},
"harness|gsm8k|5": {
"acc": 0.7134192570128886,
"acc_stderr": 0.012454841668337692
}
}
```
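
As a sketch, the aggregated results file referenced above can also be fetched and parsed directly; the filename matches the run linked in this section, and the top-level layout of the JSON is assumed, so the lookup is kept defensive:

```python
# Sketch: download the aggregated results file for this run and print its summary.
import json
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_cookinai__Valkyrie-V1",
    filename="results_2023-12-30T01-47-44.529277.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

scores = data.get("results", data)  # metrics may be nested under a "results" key
print(scores["all"])                # aggregated accuracy / stderr shown above
```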
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cookinai__Valkyrie-V1 | [
"region:us"
] | 2023-12-30T01:50:01+00:00 | {"pretty_name": "Evaluation run of cookinai/Valkyrie-V1", "dataset_summary": "Dataset automatically created during the evaluation run of model [cookinai/Valkyrie-V1](https://huggingface.co/cookinai/Valkyrie-V1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cookinai__Valkyrie-V1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T01:47:44.529277](https://huggingface.co/datasets/open-llm-leaderboard/details_cookinai__Valkyrie-V1/blob/main/results_2023-12-30T01-47-44.529277.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6522868637521901,\n \"acc_stderr\": 0.032133209567569515,\n \"acc_norm\": 0.6522561915341794,\n \"acc_norm_stderr\": 0.03280005450279144,\n \"mc1\": 0.43451652386780903,\n \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.603958710710944,\n \"mc2_stderr\": 0.01501017049153533\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6424914675767918,\n \"acc_stderr\": 0.014005494275916573,\n \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719337\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6709818761202948,\n \"acc_stderr\": 0.004688963175758131,\n \"acc_norm\": 0.8626767576180043,\n \"acc_norm_stderr\": 0.003434848525388187\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 
0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.02912652283458682,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.02912652283458682\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6897435897435897,\n \"acc_stderr\": 0.02345467488940429,\n 
\"acc_norm\": 0.6897435897435897,\n \"acc_norm_stderr\": 0.02345467488940429\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092434,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092434\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n \"acc_stderr\": 0.013265346261323792,\n \"acc_norm\": 0.8352490421455939,\n \"acc_norm_stderr\": 0.013265346261323792\n 
},\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508283,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n \"acc_stderr\": 0.016519594275297117,\n \"acc_norm\": 0.4223463687150838,\n \"acc_norm_stderr\": 0.016519594275297117\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083133,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083133\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43451652386780903,\n \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.603958710710944,\n \"mc2_stderr\": 0.01501017049153533\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140505\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7134192570128886,\n \"acc_stderr\": 0.012454841668337692\n }\n}\n```", "repo_url": "https://huggingface.co/cookinai/Valkyrie-V1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|arc:challenge|25_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|gsm8k|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hellaswag|10_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-47-44.529277.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-47-44.529277.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-47-44.529277.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T01-47-44.529277.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-47-44.529277.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-47-44.529277.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["**/details_harness|winogrande|5_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T01-47-44.529277.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T01_47_44.529277", "path": ["results_2023-12-30T01-47-44.529277.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T01-47-44.529277.parquet"]}]}]} | 2023-12-30T01:50:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cookinai/Valkyrie-V1
Dataset automatically created during the evaluation run of model cookinai/Valkyrie-V1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-30T01:47:44.529277 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of cookinai/Valkyrie-V1\n\n\n\nDataset automatically created during the evaluation run of model cookinai/Valkyrie-V1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T01:47:44.529277(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of cookinai/Valkyrie-V1\n\n\n\nDataset automatically created during the evaluation run of model cookinai/Valkyrie-V1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T01:47:44.529277(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
179,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of cookinai/Valkyrie-V1\n\n\n\nDataset automatically created during the evaluation run of model cookinai/Valkyrie-V1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T01:47:44.529277(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
43ad26640d2707780050507e573f430ac5f6b336 |
# Dataset Card for Evaluation run of jikaixuan/test_model
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jikaixuan/test_model](https://huggingface.co/jikaixuan/test_model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jikaixuan__test_model",
"harness_winogrande_5",
split="train")
```
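If you want the aggregated metrics rather than per-sample details, you can also load the "results" configuration mentioned above. The snippet below is a minimal sketch: it assumes the configuration and split names listed in this card's metadata ("results" and "latest") and simply prints whatever columns the results file exposes.

```python
from datasets import get_dataset_config_names, load_dataset

# List the available configurations (one per evaluated task, plus the aggregated "results").
configs = get_dataset_config_names("open-llm-leaderboard/details_jikaixuan__test_model")
print(len(configs), "configurations available")

# Load the aggregated results of the most recent run via the "latest" split.
results = load_dataset(
    "open-llm-leaderboard/details_jikaixuan__test_model",
    "results",
    split="latest",
)

# Inspect the columns and the first row of aggregated metrics.
print(results.column_names)
print(results[0])
```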
## Latest results
These are the [latest results from run 2023-12-30T01:50:31.139898](https://huggingface.co/datasets/open-llm-leaderboard/details_jikaixuan__test_model/blob/main/results_2023-12-30T01-50-31.139898.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6083156130977855,
"acc_stderr": 0.03325697047846345,
"acc_norm": 0.6149057049790325,
"acc_norm_stderr": 0.03395674600613233,
"mc1": 0.40758873929008566,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.575115982039762,
"mc2_stderr": 0.015744867615337492
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.01435639941800912,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.014163366896192594
},
"harness|hellaswag|10": {
"acc": 0.6490738896634136,
"acc_stderr": 0.004762844770909858,
"acc_norm": 0.8441545508862777,
"acc_norm_stderr": 0.003619674864035017
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.032662042990646775,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.032662042990646775
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.02749350424454805,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.02749350424454805
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.02460362692409742,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.02460362692409742
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.01697028909045803,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.01697028909045803
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.033922384053216174,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.033922384053216174
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588674,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588674
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965835,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965835
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.01492744710193715,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.01492744710193715
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242826,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242826
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33854748603351953,
"acc_stderr": 0.01582670009648135,
"acc_norm": 0.33854748603351953,
"acc_norm_stderr": 0.01582670009648135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508755,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508755
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.02666441088693762,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.02666441088693762
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02622964917882117,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02622964917882117
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4315514993481095,
"acc_stderr": 0.01265000799946388,
"acc_norm": 0.4315514993481095,
"acc_norm_stderr": 0.01265000799946388
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.019659922493623343,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.019659922493623343
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40758873929008566,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.575115982039762,
"mc2_stderr": 0.015744867615337492
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.2721758908263836,
"acc_stderr": 0.012259714035164548
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jikaixuan__test_model | [
"region:us"
] | 2023-12-30T01:52:46+00:00 | {"pretty_name": "Evaluation run of jikaixuan/test_model", "dataset_summary": "Dataset automatically created during the evaluation run of model [jikaixuan/test_model](https://huggingface.co/jikaixuan/test_model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jikaixuan__test_model\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T01:50:31.139898](https://huggingface.co/datasets/open-llm-leaderboard/details_jikaixuan__test_model/blob/main/results_2023-12-30T01-50-31.139898.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6083156130977855,\n \"acc_stderr\": 0.03325697047846345,\n \"acc_norm\": 0.6149057049790325,\n \"acc_norm_stderr\": 0.03395674600613233,\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.575115982039762,\n \"mc2_stderr\": 0.015744867615337492\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.01435639941800912,\n \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.014163366896192594\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6490738896634136,\n \"acc_stderr\": 0.004762844770909858,\n \"acc_norm\": 0.8441545508862777,\n \"acc_norm_stderr\": 0.003619674864035017\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 
0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646775,\n \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646775\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.02749350424454805,\n \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.02749350424454805\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6205128205128205,\n \"acc_stderr\": 
0.02460362692409742,\n \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.02460362692409742\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045803,\n \"acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045803\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.033922384053216174,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.033922384053216174\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588674,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588674\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965835,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965835\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.6322869955156951,\n \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n \"acc_stderr\": 0.01492744710193715,\n \"acc_norm\": 0.7752234993614304,\n 
\"acc_norm_stderr\": 0.01492744710193715\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242826,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242826\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n \"acc_stderr\": 0.02666441088693762,\n \"acc_norm\": 0.6720257234726688,\n \"acc_norm_stderr\": 0.02666441088693762\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882117,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882117\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4315514993481095,\n \"acc_stderr\": 0.01265000799946388,\n \"acc_norm\": 0.4315514993481095,\n \"acc_norm_stderr\": 0.01265000799946388\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.02909720956841195,\n \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.02909720956841195\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.019659922493623343,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.019659922493623343\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683906,\n \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683906\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.575115982039762,\n \"mc2_stderr\": 0.015744867615337492\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2721758908263836,\n \"acc_stderr\": 0.012259714035164548\n }\n}\n```", "repo_url": "https://huggingface.co/jikaixuan/test_model", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|arc:challenge|25_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|gsm8k|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hellaswag|10_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-50-31.139898.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-50-31.139898.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-50-31.139898.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T01-50-31.139898.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-50-31.139898.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-50-31.139898.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["**/details_harness|winogrande|5_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T01-50-31.139898.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T01_50_31.139898", "path": ["results_2023-12-30T01-50-31.139898.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T01-50-31.139898.parquet"]}]}]} | 2023-12-30T01:53:07+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jikaixuan/test_model
Dataset automatically created during the evaluation run of model jikaixuan/test_model on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-30T01:50:31.139898 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of jikaixuan/test_model\n\n\n\nDataset automatically created during the evaluation run of model jikaixuan/test_model on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T01:50:31.139898(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jikaixuan/test_model\n\n\n\nDataset automatically created during the evaluation run of model jikaixuan/test_model on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T01:50:31.139898(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
175,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jikaixuan/test_model\n\n\n\nDataset automatically created during the evaluation run of model jikaixuan/test_model on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T01:50:31.139898(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
01a655588c21fb0362831b8869e8e30be4b0769c |
# Dataset Card for Evaluation run of argilla/notux-8x7b-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [argilla/notux-8x7b-v1](https://huggingface.co/argilla/notux-8x7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_argilla__notux-8x7b-v1",
"harness_winogrande_5",
split="train")
```
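
The aggregated metrics of the run can be loaded the same way through the "results" configuration described above; a minimal sketch, assuming the same `datasets` setup (the split name "latest" follows the naming convention mentioned above):

```python
from datasets import load_dataset

# Aggregated run-level metrics; per-task configurations follow the same pattern.
results = load_dataset("open-llm-leaderboard/details_argilla__notux-8x7b-v1",
	"results",
	split="latest")
print(results)
```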
## Latest results
These are the [latest results from run 2023-12-30T02:01:22.166565](https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__notux-8x7b-v1/blob/main/results_2023-12-30T02-01-22.166565.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.712779083641632,
"acc_stderr": 0.03016532718836575,
"acc_norm": 0.7165321575751505,
"acc_norm_stderr": 0.03074418313814817,
"mc1": 0.5079559363525091,
"mc1_stderr": 0.01750128507455182,
"mc2": 0.6620853259521383,
"mc2_stderr": 0.014959983983065115
},
"harness|arc:challenge|25": {
"acc": 0.6800341296928327,
"acc_stderr": 0.013631345807016191,
"acc_norm": 0.7064846416382252,
"acc_norm_stderr": 0.01330725044494111
},
"harness|hellaswag|10": {
"acc": 0.6893049193387771,
"acc_stderr": 0.0046183239595130356,
"acc_norm": 0.8772156940848437,
"acc_norm_stderr": 0.0032751873107858395
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977108,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977108
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.032790004063100495,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.032790004063100495
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7886792452830189,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.7886792452830189,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7687861271676301,
"acc_stderr": 0.032147373020294696,
"acc_norm": 0.7687861271676301,
"acc_norm_stderr": 0.032147373020294696
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.03078373675774564,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.03078373675774564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.04598188057816542,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.04598188057816542
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47883597883597884,
"acc_stderr": 0.025728230952130733,
"acc_norm": 0.47883597883597884,
"acc_norm_stderr": 0.025728230952130733
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8516129032258064,
"acc_stderr": 0.020222737554330378,
"acc_norm": 0.8516129032258064,
"acc_norm_stderr": 0.020222737554330378
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.01438543285747646,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.01438543285747646
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6871794871794872,
"acc_stderr": 0.02350757902064536,
"acc_norm": 0.6871794871794872,
"acc_norm_stderr": 0.02350757902064536
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8067226890756303,
"acc_stderr": 0.025649470265889183,
"acc_norm": 0.8067226890756303,
"acc_norm_stderr": 0.025649470265889183
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8770642201834863,
"acc_stderr": 0.014078467983673381,
"acc_norm": 0.8770642201834863,
"acc_norm_stderr": 0.014078467983673381
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997866,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997866
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568617,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.023094329582595694,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.023094329582595694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7623318385650224,
"acc_stderr": 0.02856807946471428,
"acc_norm": 0.7623318385650224,
"acc_norm_stderr": 0.02856807946471428
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462469,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.03083349114628123,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.03083349114628123
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5892857142857143,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.5892857142857143,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.035865947385739734,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.035865947385739734
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.017456987872436193,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.017456987872436193
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8786717752234994,
"acc_stderr": 0.01167591388390672,
"acc_norm": 0.8786717752234994,
"acc_norm_stderr": 0.01167591388390672
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4592178770949721,
"acc_stderr": 0.016666783616525776,
"acc_norm": 0.4592178770949721,
"acc_norm_stderr": 0.016666783616525776
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.826797385620915,
"acc_stderr": 0.021668400256514266,
"acc_norm": 0.826797385620915,
"acc_norm_stderr": 0.021668400256514266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7909967845659164,
"acc_stderr": 0.023093140398374224,
"acc_norm": 0.7909967845659164,
"acc_norm_stderr": 0.023093140398374224
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8364197530864198,
"acc_stderr": 0.020581466138257138,
"acc_norm": 0.8364197530864198,
"acc_norm_stderr": 0.020581466138257138
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766002,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766002
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5449804432855281,
"acc_stderr": 0.012718456618701787,
"acc_norm": 0.5449804432855281,
"acc_norm_stderr": 0.012718456618701787
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7904411764705882,
"acc_stderr": 0.02472311040767707,
"acc_norm": 0.7904411764705882,
"acc_norm_stderr": 0.02472311040767707
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.017077373377856926,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.017077373377856926
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7795918367346939,
"acc_stderr": 0.02653704531214529,
"acc_norm": 0.7795918367346939,
"acc_norm_stderr": 0.02653704531214529
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.02464806896136615,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.02464806896136615
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5079559363525091,
"mc1_stderr": 0.01750128507455182,
"mc2": 0.6620853259521383,
"mc2_stderr": 0.014959983983065115
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.011082538847491899
},
"harness|gsm8k|5": {
"acc": 0.6110689916603488,
"acc_stderr": 0.01342838248127424
}
}
```
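
As a quick illustration of how these per-task entries can be post-processed, the sketch below averages the normalized accuracy over the MMLU (hendrycksTest) subtasks. It assumes the flat dictionary printed above has been saved locally under the hypothetical filename `results_snippet.json`:

```python
import json

# Hypothetical local copy of the flat results dictionary shown above.
with open("results_snippet.json") as f:
    results = json.load(f)

# Keep only the 5-shot MMLU (hendrycksTest) subtask entries.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mean_acc_norm = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {mean_acc_norm:.4f}")
```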
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
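
Until the schema is documented here, the columns of any configuration can be inspected directly after loading; a minimal sketch reusing the per-task loading call from the example above:

```python
from datasets import load_dataset

# Same call as in the loading example above, for one per-task configuration.
data = load_dataset("open-llm-leaderboard/details_argilla__notux-8x7b-v1",
	"harness_winogrande_5",
	split="train")

print(data.features)  # column names and types
print(data[0])        # one example row
```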
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_argilla__notux-8x7b-v1 | [
"region:us"
] | 2023-12-30T02:03:35+00:00 | {"pretty_name": "Evaluation run of argilla/notux-8x7b-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [argilla/notux-8x7b-v1](https://huggingface.co/argilla/notux-8x7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_argilla__notux-8x7b-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T02:01:22.166565](https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__notux-8x7b-v1/blob/main/results_2023-12-30T02-01-22.166565.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.712779083641632,\n \"acc_stderr\": 0.03016532718836575,\n \"acc_norm\": 0.7165321575751505,\n \"acc_norm_stderr\": 0.03074418313814817,\n \"mc1\": 0.5079559363525091,\n \"mc1_stderr\": 0.01750128507455182,\n \"mc2\": 0.6620853259521383,\n \"mc2_stderr\": 0.014959983983065115\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6800341296928327,\n \"acc_stderr\": 0.013631345807016191,\n \"acc_norm\": 0.7064846416382252,\n \"acc_norm_stderr\": 0.01330725044494111\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6893049193387771,\n \"acc_stderr\": 0.0046183239595130356,\n \"acc_norm\": 0.8772156940848437,\n \"acc_norm_stderr\": 0.0032751873107858395\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n \"acc_stderr\": 0.04024778401977108,\n \"acc_norm\": 0.6814814814814815,\n \"acc_norm_stderr\": 0.04024778401977108\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.032790004063100495,\n \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.032790004063100495\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n 
\"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7687861271676301,\n \"acc_stderr\": 0.032147373020294696,\n \"acc_norm\": 0.7687861271676301,\n \"acc_norm_stderr\": 0.032147373020294696\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.03078373675774564,\n \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.03078373675774564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.04598188057816542,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.04598188057816542\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.03996629574876719,\n \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.03996629574876719\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130733,\n \"acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130733\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8516129032258064,\n \"acc_stderr\": 0.020222737554330378,\n \"acc_norm\": 0.8516129032258064,\n \"acc_norm_stderr\": 0.020222737554330378\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.034139638059062345,\n \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.034139638059062345\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.01438543285747646,\n \"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.01438543285747646\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6871794871794872,\n \"acc_stderr\": 0.02350757902064536,\n \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.02350757902064536\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8067226890756303,\n \"acc_stderr\": 0.025649470265889183,\n \"acc_norm\": 0.8067226890756303,\n \"acc_norm_stderr\": 0.025649470265889183\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8770642201834863,\n \"acc_stderr\": 0.014078467983673381,\n \"acc_norm\": 0.8770642201834863,\n \"acc_norm_stderr\": 0.014078467983673381\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997866,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997866\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568617,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568617\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8523206751054853,\n \"acc_stderr\": 0.023094329582595694,\n \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.023094329582595694\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7623318385650224,\n \"acc_stderr\": 0.02856807946471428,\n \"acc_norm\": 0.7623318385650224,\n \"acc_norm_stderr\": 0.02856807946471428\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462469,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628123,\n \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628123\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5892857142857143,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.5892857142857143,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.035865947385739734,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.035865947385739734\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n \"acc_stderr\": 0.017456987872436193,\n \"acc_norm\": 0.9230769230769231,\n \"acc_norm_stderr\": 0.017456987872436193\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8786717752234994,\n \"acc_stderr\": 0.01167591388390672,\n 
\"acc_norm\": 0.8786717752234994,\n \"acc_norm_stderr\": 0.01167591388390672\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4592178770949721,\n \"acc_stderr\": 0.016666783616525776,\n \"acc_norm\": 0.4592178770949721,\n \"acc_norm_stderr\": 0.016666783616525776\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.826797385620915,\n \"acc_stderr\": 0.021668400256514266,\n \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.021668400256514266\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7909967845659164,\n \"acc_stderr\": 0.023093140398374224,\n \"acc_norm\": 0.7909967845659164,\n \"acc_norm_stderr\": 0.023093140398374224\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8364197530864198,\n \"acc_stderr\": 0.020581466138257138,\n \"acc_norm\": 0.8364197530864198,\n \"acc_norm_stderr\": 0.020581466138257138\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766002,\n \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766002\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5449804432855281,\n \"acc_stderr\": 0.012718456618701787,\n \"acc_norm\": 0.5449804432855281,\n \"acc_norm_stderr\": 0.012718456618701787\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7904411764705882,\n \"acc_stderr\": 0.02472311040767707,\n \"acc_norm\": 0.7904411764705882,\n \"acc_norm_stderr\": 0.02472311040767707\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.017077373377856926,\n \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.017077373377856926\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7795918367346939,\n \"acc_stderr\": 0.02653704531214529,\n \"acc_norm\": 0.7795918367346939,\n \"acc_norm_stderr\": 0.02653704531214529\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.02464806896136615,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.02464806896136615\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5079559363525091,\n \"mc1_stderr\": 0.01750128507455182,\n \"mc2\": 0.6620853259521383,\n \"mc2_stderr\": 0.014959983983065115\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.011082538847491899\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6110689916603488,\n \"acc_stderr\": 0.01342838248127424\n }\n}\n```", "repo_url": 
"https://huggingface.co/argilla/notux-8x7b-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-01-22.166565.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-01-22.166565.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-01-22.166565.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-01-22.166565.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-01-22.166565.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-01-22.166565.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["**/details_harness|winogrande|5_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T02-01-22.166565.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T02_01_22.166565", "path": ["results_2023-12-30T02-01-22.166565.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T02-01-22.166565.parquet"]}]}]} | 2023-12-30T02:04:00+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of argilla/notux-8x7b-v1
Dataset automatically created during the evaluation run of model argilla/notux-8x7b-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-30T02:01:22.166565 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of argilla/notux-8x7b-v1\n\n\n\nDataset automatically created during the evaluation run of model argilla/notux-8x7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:01:22.166565(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of argilla/notux-8x7b-v1\n\n\n\nDataset automatically created during the evaluation run of model argilla/notux-8x7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:01:22.166565(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of argilla/notux-8x7b-v1\n\n\n\nDataset automatically created during the evaluation run of model argilla/notux-8x7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T02:01:22.166565(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
9307c0559667dbe82617567da73b9fccecc8ad32 |
# Dataset Card for Evaluation run of Locutusque/Orca-2-13b-SFT-v6
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/Orca-2-13b-SFT-v6](https://huggingface.co/Locutusque/Orca-2-13b-SFT-v6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT-v6",
"harness_winogrande_5",
split="train")
```
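You can also pull the aggregated scores on their own. A minimal sketch, assuming the "results" configuration and its "latest" split listed in this card's metadata:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split points to the
# most recent evaluation run (other splits are named after the run timestamp).
results = load_dataset("open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT-v6",
                       "results",
                       split="latest")
```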
## Latest results
These are the [latest results from run 2023-12-30T02:03:43.380204](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT-v6/blob/main/results_2023-12-30T02-03-43.380204.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5890270904640104,
"acc_stderr": 0.03291493635145001,
"acc_norm": 0.5988157276074748,
"acc_norm_stderr": 0.033710582507890004,
"mc1": 0.379436964504284,
"mc1_stderr": 0.016987039266142985,
"mc2": 0.5400874549545076,
"mc2_stderr": 0.015468319271968397
},
"harness|arc:challenge|25": {
"acc": 0.5622866894197952,
"acc_stderr": 0.01449757388110829,
"acc_norm": 0.6040955631399317,
"acc_norm_stderr": 0.014291228393536585
},
"harness|hellaswag|10": {
"acc": 0.6218880701055567,
"acc_stderr": 0.004839247332606039,
"acc_norm": 0.8046205935072694,
"acc_norm_stderr": 0.003956821705018451
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.03459777606810535,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.03459777606810535
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.03268335899936336,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.03268335899936336
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35978835978835977,
"acc_stderr": 0.024718075944129288,
"acc_norm": 0.35978835978835977,
"acc_norm_stderr": 0.024718075944129288
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.026148685930671742,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.026148685930671742
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365907,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365907
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723875,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723875
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5820512820512821,
"acc_stderr": 0.02500732988246122,
"acc_norm": 0.5820512820512821,
"acc_norm_stderr": 0.02500732988246122
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815642,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815642
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7834862385321101,
"acc_stderr": 0.01765871059444313,
"acc_norm": 0.7834862385321101,
"acc_norm_stderr": 0.01765871059444313
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070415,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070415
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690876,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690876
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3407821229050279,
"acc_stderr": 0.015852002449862103,
"acc_norm": 0.3407821229050279,
"acc_norm_stderr": 0.015852002449862103
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281406,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281406
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488544,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488544
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291488,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291488
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42633637548891784,
"acc_stderr": 0.012630884771599698,
"acc_norm": 0.42633637548891784,
"acc_norm_stderr": 0.012630884771599698
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.030042615832714864,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.030042615832714864
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.019808281317449838,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.019808281317449838
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.379436964504284,
"mc1_stderr": 0.016987039266142985,
"mc2": 0.5400874549545076,
"mc2_stderr": 0.015468319271968397
},
"harness|winogrande|5": {
"acc": 0.7742699289660616,
"acc_stderr": 0.011749626260902542
},
"harness|gsm8k|5": {
"acc": 0.05079605761940864,
"acc_stderr": 0.006048352096878093
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT-v6 | [
"region:us"
] | 2023-12-30T02:06:02+00:00 | {"pretty_name": "Evaluation run of Locutusque/Orca-2-13b-SFT-v6", "dataset_summary": "Dataset automatically created during the evaluation run of model [Locutusque/Orca-2-13b-SFT-v6](https://huggingface.co/Locutusque/Orca-2-13b-SFT-v6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT-v6\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T02:03:43.380204](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT-v6/blob/main/results_2023-12-30T02-03-43.380204.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5890270904640104,\n \"acc_stderr\": 0.03291493635145001,\n \"acc_norm\": 0.5988157276074748,\n \"acc_norm_stderr\": 0.033710582507890004,\n \"mc1\": 0.379436964504284,\n \"mc1_stderr\": 0.016987039266142985,\n \"mc2\": 0.5400874549545076,\n \"mc2_stderr\": 0.015468319271968397\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5622866894197952,\n \"acc_stderr\": 0.01449757388110829,\n \"acc_norm\": 0.6040955631399317,\n \"acc_norm_stderr\": 0.014291228393536585\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6218880701055567,\n \"acc_stderr\": 0.004839247332606039,\n \"acc_norm\": 0.8046205935072694,\n \"acc_norm_stderr\": 0.003956821705018451\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.03459777606810535,\n \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.03459777606810535\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 
0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.03268335899936336,\n \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.03268335899936336\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35978835978835977,\n \"acc_stderr\": 0.024718075944129288,\n \"acc_norm\": 0.35978835978835977,\n \"acc_norm_stderr\": 0.024718075944129288\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n \"acc_stderr\": 0.026148685930671742,\n \"acc_norm\": 0.6967741935483871,\n \"acc_norm_stderr\": 0.026148685930671742\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365907,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365907\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723875,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723875\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.5820512820512821,\n \"acc_stderr\": 0.02500732988246122,\n \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.02500732988246122\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815642,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815642\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7834862385321101,\n \"acc_stderr\": 0.01765871059444313,\n \"acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.01765871059444313\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070415,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070415\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690876,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690876\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n \"acc_stderr\": 0.01489723522945071,\n 
\"acc_norm\": 0.776500638569604,\n \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3407821229050279,\n \"acc_stderr\": 0.015852002449862103,\n \"acc_norm\": 0.3407821229050279,\n \"acc_norm_stderr\": 0.015852002449862103\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281406,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281406\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.662379421221865,\n \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291488,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291488\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42633637548891784,\n \"acc_stderr\": 0.012630884771599698,\n \"acc_norm\": 0.42633637548891784,\n \"acc_norm_stderr\": 0.012630884771599698\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.030042615832714864,\n \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.030042615832714864\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.019808281317449838,\n \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.019808281317449838\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.7263681592039801,\n \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.379436964504284,\n \"mc1_stderr\": 0.016987039266142985,\n \"mc2\": 0.5400874549545076,\n \"mc2_stderr\": 0.015468319271968397\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7742699289660616,\n \"acc_stderr\": 0.011749626260902542\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05079605761940864,\n \"acc_stderr\": 0.006048352096878093\n }\n}\n```", "repo_url": 
"https://huggingface.co/Locutusque/Orca-2-13b-SFT-v6", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-03-43.380204.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-03-43.380204.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-03-43.380204.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-03-43.380204.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-03-43.380204.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-03-43.380204.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["**/details_harness|winogrande|5_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T02-03-43.380204.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T02_03_43.380204", "path": ["results_2023-12-30T02-03-43.380204.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T02-03-43.380204.parquet"]}]}]} | 2023-12-30T02:06:32+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Locutusque/Orca-2-13b-SFT-v6
Dataset automatically created during the evaluation run of model Locutusque/Orca-2-13b-SFT-v6 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
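```python
from datasets import load_dataset

# Example: load the latest results for one task configuration of this run
data = load_dataset("open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT-v6",
	"harness_winogrande_5",
	split="train")
```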
## Latest results
These are the latest results from run 2023-12-30T02:03:43.380204 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
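For reference, the aggregate scores of this run (per-task scores are in the full results file):

```python
{
    "all": {
        "acc": 0.5890270904640104,
        "acc_stderr": 0.03291493635145001,
        "acc_norm": 0.5988157276074748,
        "acc_norm_stderr": 0.033710582507890004,
        "mc1": 0.379436964504284,
        "mc1_stderr": 0.016987039266142985,
        "mc2": 0.5400874549545076,
        "mc2_stderr": 0.015468319271968397
    }
}
```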
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Locutusque/Orca-2-13b-SFT-v6\n\n\n\nDataset automatically created during the evaluation run of model Locutusque/Orca-2-13b-SFT-v6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:03:43.380204(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Locutusque/Orca-2-13b-SFT-v6\n\n\n\nDataset automatically created during the evaluation run of model Locutusque/Orca-2-13b-SFT-v6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:03:43.380204(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Locutusque/Orca-2-13b-SFT-v6\n\n\n\nDataset automatically created during the evaluation run of model Locutusque/Orca-2-13b-SFT-v6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T02:03:43.380204(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
29408bc651757641b007409af382eca5dd60d83e |
# Dataset Card for Evaluation run of Walmart-the-bag/Solar-10.7B-Cato
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Walmart-the-bag/Solar-10.7B-Cato](https://huggingface.co/Walmart-the-bag/Solar-10.7B-Cato) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Walmart-the-bag__Solar-10.7B-Cato",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-30T02:07:16.124496](https://huggingface.co/datasets/open-llm-leaderboard/details_Walmart-the-bag__Solar-10.7B-Cato/blob/main/results_2023-12-30T02-07-16.124496.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6601184986157275,
"acc_stderr": 0.03173344410424321,
"acc_norm": 0.6615926738267002,
"acc_norm_stderr": 0.03237146731014547,
"mc1": 0.4700122399020808,
"mc1_stderr": 0.017471992091697544,
"mc2": 0.6168232864590555,
"mc2_stderr": 0.015630771495356736
},
"harness|arc:challenge|25": {
"acc": 0.6390784982935154,
"acc_stderr": 0.014034761386175452,
"acc_norm": 0.6868600682593856,
"acc_norm_stderr": 0.01355267154362349
},
"harness|hellaswag|10": {
"acc": 0.6845249950209121,
"acc_stderr": 0.0046375504780073636,
"acc_norm": 0.8615813582951604,
"acc_norm_stderr": 0.0034463307489637123
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137282,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137282
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.03196758697835363,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.03196758697835363
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.02573364199183898,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.02573364199183898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.02366435940288023,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.02366435940288023
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.02415222596280158,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.02415222596280158
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657567,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992005,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992005
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3229050279329609,
"acc_stderr": 0.015638440380241484,
"acc_norm": 0.3229050279329609,
"acc_norm_stderr": 0.015638440380241484
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817962,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48239895697522817,
"acc_stderr": 0.012762321298823646,
"acc_norm": 0.48239895697522817,
"acc_norm_stderr": 0.012762321298823646
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7463235294117647,
"acc_stderr": 0.026431329870789527,
"acc_norm": 0.7463235294117647,
"acc_norm_stderr": 0.026431329870789527
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174923,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174923
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4700122399020808,
"mc1_stderr": 0.017471992091697544,
"mc2": 0.6168232864590555,
"mc2_stderr": 0.015630771495356736
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435088
},
"harness|gsm8k|5": {
"acc": 0.6459438968915845,
"acc_stderr": 0.013172728385222576
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Walmart-the-bag__Solar-10.7B-Cato | [
"region:us"
] | 2023-12-30T02:09:28+00:00 | {"pretty_name": "Evaluation run of Walmart-the-bag/Solar-10.7B-Cato", "dataset_summary": "Dataset automatically created during the evaluation run of model [Walmart-the-bag/Solar-10.7B-Cato](https://huggingface.co/Walmart-the-bag/Solar-10.7B-Cato) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Walmart-the-bag__Solar-10.7B-Cato\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T02:07:16.124496](https://huggingface.co/datasets/open-llm-leaderboard/details_Walmart-the-bag__Solar-10.7B-Cato/blob/main/results_2023-12-30T02-07-16.124496.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6601184986157275,\n \"acc_stderr\": 0.03173344410424321,\n \"acc_norm\": 0.6615926738267002,\n \"acc_norm_stderr\": 0.03237146731014547,\n \"mc1\": 0.4700122399020808,\n \"mc1_stderr\": 0.017471992091697544,\n \"mc2\": 0.6168232864590555,\n \"mc2_stderr\": 0.015630771495356736\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6390784982935154,\n \"acc_stderr\": 0.014034761386175452,\n \"acc_norm\": 0.6868600682593856,\n \"acc_norm_stderr\": 0.01355267154362349\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6845249950209121,\n \"acc_stderr\": 0.0046375504780073636,\n \"acc_norm\": 0.8615813582951604,\n \"acc_norm_stderr\": 0.0034463307489637123\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137282,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137282\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.03196758697835363,\n \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.03196758697835363\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.02573364199183898,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.02573364199183898\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.02366435940288023,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.02366435940288023\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.02415222596280158,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.02415222596280158\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8058748403575989,\n \"acc_stderr\": 0.014143970276657567,\n \"acc_norm\": 0.8058748403575989,\n \"acc_norm_stderr\": 0.014143970276657567\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992005,\n \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992005\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3229050279329609,\n \"acc_stderr\": 0.015638440380241484,\n \"acc_norm\": 0.3229050279329609,\n \"acc_norm_stderr\": 0.015638440380241484\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.024051029739912258,\n \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.024051029739912258\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817962,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817962\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48239895697522817,\n \"acc_stderr\": 0.012762321298823646,\n \"acc_norm\": 0.48239895697522817,\n \"acc_norm_stderr\": 0.012762321298823646\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789527,\n \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789527\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174923,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174923\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4700122399020808,\n \"mc1_stderr\": 0.017471992091697544,\n \"mc2\": 0.6168232864590555,\n \"mc2_stderr\": 0.015630771495356736\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435088\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6459438968915845,\n \"acc_stderr\": 
0.013172728385222576\n }\n}\n```", "repo_url": "https://huggingface.co/Walmart-the-bag/Solar-10.7B-Cato", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-07-16.124496.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-07-16.124496.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-07-16.124496.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-07-16.124496.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-07-16.124496.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T02_07_16.124496", "path": ["**/details_harness|winogrande|5_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T02-07-16.124496.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_30T02_07_16.124496", "path": ["results_2023-12-30T02-07-16.124496.parquet"]}, {"split": "latest", "path": ["results_2023-12-30T02-07-16.124496.parquet"]}]}]} | 2023-12-30T02:09:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Walmart-the-bag/Solar-10.7B-Cato
Dataset automatically created during the evaluation run of model Walmart-the-bag/Solar-10.7B-Cato on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-30T02:07:16.124496 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Walmart-the-bag/Solar-10.7B-Cato\n\n\n\nDataset automatically created during the evaluation run of model Walmart-the-bag/Solar-10.7B-Cato on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:07:16.124496(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Walmart-the-bag/Solar-10.7B-Cato\n\n\n\nDataset automatically created during the evaluation run of model Walmart-the-bag/Solar-10.7B-Cato on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:07:16.124496(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Walmart-the-bag/Solar-10.7B-Cato\n\n\n\nDataset automatically created during the evaluation run of model Walmart-the-bag/Solar-10.7B-Cato on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T02:07:16.124496(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
bde39d54143dca4c7d65b05259f03a9033db4909 |
# Dataset Card for Evaluation run of smelborp/MixtralOrochi8x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [smelborp/MixtralOrochi8x7B](https://huggingface.co/smelborp/MixtralOrochi8x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B",
"harness_winogrande_5",
split="train")
```
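
If you only need the aggregated scores rather than the per-example details, the repository also exposes a `results` configuration. The snippet below is a minimal, untested sketch; it assumes the `datasets` library is installed and that the `results` configuration carries a `latest` split like the per-task configurations do:

```python
from datasets import load_dataset

# Load the aggregated results for the most recent run ("latest" split).
results = load_dataset("open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B",
	"results",
	split="latest")

# Convert to a pandas DataFrame for easier inspection of the metrics.
print(results.to_pandas().head())
```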
## Latest results
These are the [latest results from run 2023-12-30T02:24:04.286101](https://huggingface.co/datasets/open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B/blob/main/results_2023-12-30T02-24-04.286101.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6933650457336848,
"acc_stderr": 0.030427906094644037,
"acc_norm": 0.7040806942153237,
"acc_norm_stderr": 0.03106416581410797,
"mc1": 0.46266829865361075,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.6399397085586839,
"mc2_stderr": 0.015220747814252549
},
"harness|arc:challenge|25": {
"acc": 0.6655290102389079,
"acc_stderr": 0.013787460322441379,
"acc_norm": 0.7030716723549488,
"acc_norm_stderr": 0.013352025976725225
},
"harness|hellaswag|10": {
"acc": 0.6814379605656243,
"acc_stderr": 0.00464966527389064,
"acc_norm": 0.8609838677554272,
"acc_norm_stderr": 0.0034525630964691227
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742399,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7697368421052632,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.7697368421052632,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7886792452830189,
"acc_stderr": 0.02512576648482785,
"acc_norm": 0.7886792452830189,
"acc_norm_stderr": 0.02512576648482785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565656,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565656
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7021276595744681,
"acc_stderr": 0.02989614568209546,
"acc_norm": 0.7021276595744681,
"acc_norm_stderr": 0.02989614568209546
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.046774730044912,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.046774730044912
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5211640211640212,
"acc_stderr": 0.025728230952130723,
"acc_norm": 0.5211640211640212,
"acc_norm_stderr": 0.025728230952130723
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8387096774193549,
"acc_stderr": 0.020923327006423298,
"acc_norm": 0.8387096774193549,
"acc_norm_stderr": 0.020923327006423298
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5960591133004927,
"acc_stderr": 0.03452453903822032,
"acc_norm": 0.5960591133004927,
"acc_norm_stderr": 0.03452453903822032
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.025190921114603918,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.025190921114603918
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8025210084033614,
"acc_stderr": 0.025859164122051453,
"acc_norm": 0.8025210084033614,
"acc_norm_stderr": 0.025859164122051453
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8807339449541285,
"acc_stderr": 0.013895729292588957,
"acc_norm": 0.8807339449541285,
"acc_norm_stderr": 0.013895729292588957
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8725490196078431,
"acc_stderr": 0.023405530480846315,
"acc_norm": 0.8725490196078431,
"acc_norm_stderr": 0.023405530480846315
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.021644195727955173,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.021644195727955173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.02838039114709471,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.02838039114709471
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.034465133507525975,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.034465133507525975
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622793,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622793
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.03083349114628123,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.03083349114628123
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.01831589168562586,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.01831589168562586
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8722860791826309,
"acc_stderr": 0.011935626313999874,
"acc_norm": 0.8722860791826309,
"acc_norm_stderr": 0.011935626313999874
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.464804469273743,
"acc_stderr": 0.016681020931076655,
"acc_norm": 0.464804469273743,
"acc_norm_stderr": 0.016681020931076655
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.0231527224394023,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.0231527224394023
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8117283950617284,
"acc_stderr": 0.021751866060815875,
"acc_norm": 0.8117283950617284,
"acc_norm_stderr": 0.021751866060815875
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5212765957446809,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.5212765957446809,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5260756192959583,
"acc_stderr": 0.012752858346533143,
"acc_norm": 0.5260756192959583,
"acc_norm_stderr": 0.012752858346533143
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.025187786660227255,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.025187786660227255
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7369281045751634,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.7369281045751634,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8040816326530612,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.8040816326530612,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072878,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072878
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46266829865361075,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.6399397085586839,
"mc2_stderr": 0.015220747814252549
},
"harness|winogrande|5": {
"acc": 0.7987371744277821,
"acc_stderr": 0.011268519971577682
},
"harness|gsm8k|5": {
"acc": 0.1728582259287339,
"acc_stderr": 0.010415432246200583
}
}
```
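
To turn the raw JSON above into a single headline number, a short sketch like the following can be used (the `results.json` filename is only an assumption for illustration; the dictionary is the one printed above):

```python
import json

# Load the results dictionary shown above (the file path is hypothetical).
with open("results.json") as f:
    results = json.load(f)

# Average the normalized accuracy over all MMLU (hendrycksTest) sub-tasks.
mmlu = [v["acc_norm"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU average (acc_norm): {sum(mmlu) / len(mmlu):.4f}")
```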
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B | [
"region:us"
] | 2023-12-30T02:26:28+00:00 | {"pretty_name": "Evaluation run of smelborp/MixtralOrochi8x7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [smelborp/MixtralOrochi8x7B](https://huggingface.co/smelborp/MixtralOrochi8x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T02:24:04.286101](https://huggingface.co/datasets/open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B/blob/main/results_2023-12-30T02-24-04.286101.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6933650457336848,\n \"acc_stderr\": 0.030427906094644037,\n \"acc_norm\": 0.7040806942153237,\n \"acc_norm_stderr\": 0.03106416581410797,\n \"mc1\": 0.46266829865361075,\n \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.6399397085586839,\n \"mc2_stderr\": 0.015220747814252549\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6655290102389079,\n \"acc_stderr\": 0.013787460322441379,\n \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.013352025976725225\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6814379605656243,\n \"acc_stderr\": 0.00464966527389064,\n \"acc_norm\": 0.8609838677554272,\n \"acc_norm_stderr\": 0.0034525630964691227\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742399,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7697368421052632,\n \"acc_stderr\": 0.03426059424403165,\n \"acc_norm\": 0.7697368421052632,\n \"acc_norm_stderr\": 0.03426059424403165\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.02512576648482785,\n \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.02512576648482785\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n \"acc_stderr\": 0.030085743248565656,\n \"acc_norm\": 0.8472222222222222,\n \"acc_norm_stderr\": 0.030085743248565656\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7021276595744681,\n \"acc_stderr\": 0.02989614568209546,\n \"acc_norm\": 0.7021276595744681,\n \"acc_norm_stderr\": 0.02989614568209546\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.046774730044912,\n \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.046774730044912\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419036,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419036\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5211640211640212,\n \"acc_stderr\": 0.025728230952130723,\n \"acc_norm\": 0.5211640211640212,\n \"acc_norm_stderr\": 0.025728230952130723\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8387096774193549,\n \"acc_stderr\": 0.020923327006423298,\n \"acc_norm\": 0.8387096774193549,\n \"acc_norm_stderr\": 0.020923327006423298\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5960591133004927,\n \"acc_stderr\": 0.03452453903822032,\n \"acc_norm\": 0.5960591133004927,\n \"acc_norm_stderr\": 0.03452453903822032\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603918,\n \"acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603918\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 0.01599622932024412\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8025210084033614,\n \"acc_stderr\": 0.025859164122051453,\n \"acc_norm\": 0.8025210084033614,\n \"acc_norm_stderr\": 0.025859164122051453\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8807339449541285,\n \"acc_stderr\": 0.013895729292588957,\n \"acc_norm\": 0.8807339449541285,\n \"acc_norm_stderr\": 0.013895729292588957\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8725490196078431,\n \"acc_stderr\": 0.023405530480846315,\n \"acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.023405530480846315\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n \"acc_stderr\": 0.02838039114709471,\n \"acc_norm\": 0.7668161434977578,\n \"acc_norm_stderr\": 0.02838039114709471\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.034465133507525975,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.034465133507525975\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622793,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622793\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628123,\n \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628123\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n \"acc_stderr\": 0.01831589168562586,\n \"acc_norm\": 0.9145299145299145,\n \"acc_norm_stderr\": 0.01831589168562586\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8722860791826309,\n \"acc_stderr\": 
0.011935626313999874,\n \"acc_norm\": 0.8722860791826309,\n \"acc_norm_stderr\": 0.011935626313999874\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.464804469273743,\n \"acc_stderr\": 0.016681020931076655,\n \"acc_norm\": 0.464804469273743,\n \"acc_norm_stderr\": 0.016681020931076655\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.0231527224394023,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.0231527224394023\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8117283950617284,\n \"acc_stderr\": 0.021751866060815875,\n \"acc_norm\": 0.8117283950617284,\n \"acc_norm_stderr\": 0.021751866060815875\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5212765957446809,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.5212765957446809,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5260756192959583,\n \"acc_stderr\": 0.012752858346533143,\n \"acc_norm\": 0.5260756192959583,\n \"acc_norm_stderr\": 0.012752858346533143\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.025187786660227255,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.025187786660227255\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7369281045751634,\n \"acc_stderr\": 0.017812676542320657,\n \"acc_norm\": 0.7369281045751634,\n \"acc_norm_stderr\": 0.017812676542320657\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.025409301953225678,\n \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.025409301953225678\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072878,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072878\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46266829865361075,\n \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.6399397085586839,\n \"mc2_stderr\": 0.015220747814252549\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7987371744277821,\n \"acc_stderr\": 0.011268519971577682\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1728582259287339,\n \"acc_stderr\": 0.010415432246200583\n }\n}\n```", "repo_url": 
"https://huggingface.co/smelborp/MixtralOrochi8x7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-24-04.286101.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-24-04.286101.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-24-04.286101.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-24-04.286101.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-24-04.286101.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-24-04.286101.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["**/details_harness|winogrande|5_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T02-24-04.286101.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T02_24_04.286101", "path": ["results_2023-12-30T02-24-04.286101.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T02-24-04.286101.parquet"]}]}]} | 2023-12-30T02:26:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of smelborp/MixtralOrochi8x7B
Dataset automatically created during the evaluation run of model smelborp/MixtralOrochi8x7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
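A minimal sketch of that call, assuming this repository follows the same `open-llm-leaderboard/details_<org>__<model>` naming pattern used elsewhere in these cards:
```python
from datasets import load_dataset

# Repo id is an assumption based on the standard details_<org>__<model> naming pattern
data = load_dataset("open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B",
	"harness_winogrande_5",  # any of the 63 task configurations can be used here
	split="train")           # "train" always points to the latest results
```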
## Latest results
These are the latest results from run 2023-12-30T02:24:04.286101 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of smelborp/MixtralOrochi8x7B\n\n\n\nDataset automatically created during the evaluation run of model smelborp/MixtralOrochi8x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:24:04.286101(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of smelborp/MixtralOrochi8x7B\n\n\n\nDataset automatically created during the evaluation run of model smelborp/MixtralOrochi8x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:24:04.286101(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of smelborp/MixtralOrochi8x7B\n\n\n\nDataset automatically created during the evaluation run of model smelborp/MixtralOrochi8x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T02:24:04.286101(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
b193322879089c03872f48aa78b978ce4f041f6a |
# Dataset Card for Evaluation run of jikaixuan/test_merged_model
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jikaixuan/test_merged_model](https://huggingface.co/jikaixuan/test_merged_model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jikaixuan__test_merged_model",
"harness_winogrande_5",
split="train")
```
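The same call also works for the aggregated "results" configuration mentioned above; the `latest` split always resolves to the most recent run. A minimal sketch (inspect the loaded split before relying on specific column names, since the exact schema is not documented here):
```python
from datasets import load_dataset

# Aggregated metrics for this model; "latest" resolves to the newest evaluation run
results = load_dataset("open-llm-leaderboard/details_jikaixuan__test_merged_model",
	"results",
	split="latest")
print(results)  # inspect the available columns before indexing into them
```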
## Latest results
These are the [latest results from run 2023-12-30T02:33:40.705654](https://huggingface.co/datasets/open-llm-leaderboard/details_jikaixuan__test_merged_model/blob/main/results_2023-12-30T02-33-40.705654.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6345213677276633,
"acc_stderr": 0.03239882126723081,
"acc_norm": 0.640269245162976,
"acc_norm_stderr": 0.03305121705084123,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135032,
"mc2": 0.4865410177312943,
"mc2_stderr": 0.014876963942379959
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520767,
"acc_norm": 0.6160409556313993,
"acc_norm_stderr": 0.01421244498065189
},
"harness|hellaswag|10": {
"acc": 0.6276638119896435,
"acc_stderr": 0.004824393076826623,
"acc_norm": 0.831009759012149,
"acc_norm_stderr": 0.0037397742854185186
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601688,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601688
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266875,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266875
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854053,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294406999,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294406999
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767867,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767867
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537375,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537375
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45371577574967403,
"acc_stderr": 0.012715404841277738,
"acc_norm": 0.45371577574967403,
"acc_norm_stderr": 0.012715404841277738
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.0286619962023353,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.0286619962023353
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495155,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495155
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065684,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065684
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135032,
"mc2": 0.4865410177312943,
"mc2_stderr": 0.014876963942379959
},
"harness|winogrande|5": {
"acc": 0.7845303867403315,
"acc_stderr": 0.011555295286059282
},
"harness|gsm8k|5": {
"acc": 0.38968915845337376,
"acc_stderr": 0.013433123236110707
}
}
```
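As a quick illustration of working with the structure above, the following sketch parses that JSON (assumed to be saved locally as `results.json`, a hypothetical filename) and ranks the MMLU sub-tasks by normalized accuracy, using only the `acc_norm` fields shown in every `hendrycksTest-*` entry:
```python
import json

# Load the per-task metrics shown above (assumes the JSON was saved as results.json)
with open("results.json") as f:
    metrics = json.load(f)

# Collect normalized accuracy for every MMLU (hendrycksTest) sub-task
mmlu = {
    task.split("-", 1)[1].split("|")[0]: scores["acc_norm"]
    for task, scores in metrics.items()
    if task.startswith("harness|hendrycksTest-")
}

# Show the five strongest and five weakest sub-tasks
ranked = sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)
for name, score in ranked[:5] + ranked[-5:]:
    print(f"{name:40s} {score:.3f}")
```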
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jikaixuan__test_merged_model | [
"region:us"
] | 2023-12-30T02:35:57+00:00 | {"pretty_name": "Evaluation run of jikaixuan/test_merged_model", "dataset_summary": "Dataset automatically created during the evaluation run of model [jikaixuan/test_merged_model](https://huggingface.co/jikaixuan/test_merged_model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jikaixuan__test_merged_model\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T02:33:40.705654](https://huggingface.co/datasets/open-llm-leaderboard/details_jikaixuan__test_merged_model/blob/main/results_2023-12-30T02-33-40.705654.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6345213677276633,\n \"acc_stderr\": 0.03239882126723081,\n \"acc_norm\": 0.640269245162976,\n \"acc_norm_stderr\": 0.03305121705084123,\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.016419874731135032,\n \"mc2\": 0.4865410177312943,\n \"mc2_stderr\": 0.014876963942379959\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520767,\n \"acc_norm\": 0.6160409556313993,\n \"acc_norm_stderr\": 0.01421244498065189\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6276638119896435,\n \"acc_stderr\": 0.004824393076826623,\n \"acc_norm\": 0.831009759012149,\n \"acc_norm_stderr\": 0.0037397742854185186\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601688,\n \"acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601688\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.818348623853211,\n \"acc_stderr\": 0.016530617409266875,\n \"acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266875\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854053,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854053\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n \"acc_stderr\": 0.014000791294406999,\n 
\"acc_norm\": 0.8109833971902938,\n \"acc_norm_stderr\": 0.014000791294406999\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.014355911964767867,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.014355911964767867\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.0248480182638752,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.0248480182638752\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537375,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537375\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n \"acc_stderr\": 0.012715404841277738,\n \"acc_norm\": 0.45371577574967403,\n \"acc_norm_stderr\": 0.012715404841277738\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.0286619962023353,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.0286619962023353\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495155,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495155\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065684,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065684\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.016419874731135032,\n \"mc2\": 0.4865410177312943,\n \"mc2_stderr\": 0.014876963942379959\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7845303867403315,\n \"acc_stderr\": 0.011555295286059282\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.38968915845337376,\n \"acc_stderr\": 0.013433123236110707\n }\n}\n```", "repo_url": 
"https://huggingface.co/jikaixuan/test_merged_model", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-33-40.705654.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-33-40.705654.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-33-40.705654.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-33-40.705654.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-33-40.705654.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-33-40.705654.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["**/details_harness|winogrande|5_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T02-33-40.705654.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T02_33_40.705654", "path": ["results_2023-12-30T02-33-40.705654.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T02-33-40.705654.parquet"]}]}]} | 2023-12-30T02:36:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jikaixuan/test_merged_model
Dataset automatically created during the evaluation run of model jikaixuan/test_merged_model on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
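A minimal sketch of that call, following the pattern used by the other leaderboard detail datasets in this collection (the repository name for this model is assumed from the usual "details_<org>__<model>" naming convention):

```python
from datasets import load_dataset

# Repository name assumed from the usual open-llm-leaderboard naming convention.
data = load_dataset("open-llm-leaderboard/details_jikaixuan__test_merged_model",
                    "harness_winogrande_5",
                    split="train")
```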
## Latest results
These are the latest results from run 2023-12-30T02:33:40.705654 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of jikaixuan/test_merged_model\n\n\n\nDataset automatically created during the evaluation run of model jikaixuan/test_merged_model on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:33:40.705654(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jikaixuan/test_merged_model\n\n\n\nDataset automatically created during the evaluation run of model jikaixuan/test_merged_model on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:33:40.705654(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jikaixuan/test_merged_model\n\n\n\nDataset automatically created during the evaluation run of model jikaixuan/test_merged_model on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T02:33:40.705654(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
71112df7c26ea5dae45a7432c87cab5befa5986e |
# Dataset Card for Evaluation run of Azazelle/smol_bruin-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Azazelle/smol_bruin-7b](https://huggingface.co/Azazelle/smol_bruin-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azazelle__smol_bruin-7b",
"harness_winogrande_5",
split="train")
```
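
The aggregated metrics described above are exposed through the "results" configuration, whose "latest" split always resolves to the most recent run; a short sketch of loading it (config and split names are taken from this card's metadata, while the row layout is an assumption):

```python
from datasets import load_dataset

# "results" holds the aggregated scores; "latest" points at the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_Azazelle__smol_bruin-7b",
                       "results",
                       split="latest")
print(results[0])  # a single row carrying the aggregated metrics for the latest run
```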
## Latest results
These are the [latest results from run 2023-12-30T02:44:17.580970](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__smol_bruin-7b/blob/main/results_2023-12-30T02-44-17.580970.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6546543613108019,
"acc_stderr": 0.03193794801675513,
"acc_norm": 0.6545777310679081,
"acc_norm_stderr": 0.03259631769881577,
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347972,
"mc2": 0.5564800986558189,
"mc2_stderr": 0.01563029024385339
},
"harness|arc:challenge|25": {
"acc": 0.6501706484641638,
"acc_stderr": 0.013936809212158287,
"acc_norm": 0.6757679180887372,
"acc_norm_stderr": 0.013678810399518826
},
"harness|hellaswag|10": {
"acc": 0.6885082652857997,
"acc_stderr": 0.004621568125102051,
"acc_norm": 0.8647679745070703,
"acc_norm_stderr": 0.0034127234117275512
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055266,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250444,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250444
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579665,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579665
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508762,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508762
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4581005586592179,
"acc_stderr": 0.016663683295020534,
"acc_norm": 0.4581005586592179,
"acc_norm_stderr": 0.016663683295020534
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47783572359843546,
"acc_stderr": 0.012757683047716177,
"acc_norm": 0.47783572359843546,
"acc_norm_stderr": 0.012757683047716177
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.018690850273595284,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.018690850273595284
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249772,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249772
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347972,
"mc2": 0.5564800986558189,
"mc2_stderr": 0.01563029024385339
},
"harness|winogrande|5": {
"acc": 0.8113654301499605,
"acc_stderr": 0.010995172318019806
},
"harness|gsm8k|5": {
"acc": 0.7043214556482184,
"acc_stderr": 0.012570068947898772
}
}
```
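
Given a results dictionary shaped like the JSON above, the per-domain MMLU numbers can be summarised with a few lines of post-processing; a minimal sketch (the key prefix and field names are taken from the dictionary shown above):

```python
# Average the MMLU (hendrycksTest) sub-task accuracies from a results dict
# shaped like the JSON above; `results` is assumed to already be that dict.
def mmlu_average(results: dict) -> float:
    accs = [v["acc"] for key, v in results.items()
            if key.startswith("harness|hendrycksTest-")]
    return sum(accs) / len(accs)
```

Calling `mmlu_average` on the dictionary above returns the mean 5-shot accuracy across the 57 MMLU sub-tasks.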
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Azazelle__smol_bruin-7b | [
"region:us"
] | 2023-12-30T02:46:30+00:00 | {"pretty_name": "Evaluation run of Azazelle/smol_bruin-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Azazelle/smol_bruin-7b](https://huggingface.co/Azazelle/smol_bruin-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azazelle__smol_bruin-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T02:44:17.580970](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__smol_bruin-7b/blob/main/results_2023-12-30T02-44-17.580970.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6546543613108019,\n \"acc_stderr\": 0.03193794801675513,\n \"acc_norm\": 0.6545777310679081,\n \"acc_norm_stderr\": 0.03259631769881577,\n \"mc1\": 0.3880048959608323,\n \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5564800986558189,\n \"mc2_stderr\": 0.01563029024385339\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6501706484641638,\n \"acc_stderr\": 0.013936809212158287,\n \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.013678810399518826\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6885082652857997,\n \"acc_stderr\": 0.004621568125102051,\n \"acc_norm\": 0.8647679745070703,\n \"acc_norm_stderr\": 0.0034127234117275512\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 
0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250444,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250444\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579665,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579665\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n \"acc_stderr\": 0.030216831011508762,\n \"acc_norm\": 0.7174887892376681,\n \"acc_norm_stderr\": 0.030216831011508762\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n 
\"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.02317629820399201,\n \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.02317629820399201\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4581005586592179,\n \"acc_stderr\": 0.016663683295020534,\n \"acc_norm\": 0.4581005586592179,\n \"acc_norm_stderr\": 0.016663683295020534\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47783572359843546,\n \"acc_stderr\": 0.012757683047716177,\n \"acc_norm\": 0.47783572359843546,\n \"acc_norm_stderr\": 0.012757683047716177\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.018690850273595284,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.018690850273595284\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249772,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249772\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3880048959608323,\n \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5564800986558189,\n \"mc2_stderr\": 0.01563029024385339\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.010995172318019806\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7043214556482184,\n \"acc_stderr\": 0.012570068947898772\n }\n}\n```", "repo_url": 
"https://huggingface.co/Azazelle/smol_bruin-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-44-17.580970.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-44-17.580970.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-44-17.580970.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-44-17.580970.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-44-17.580970.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-44-17.580970.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["**/details_harness|winogrande|5_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T02-44-17.580970.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T02_44_17.580970", "path": ["results_2023-12-30T02-44-17.580970.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T02-44-17.580970.parquet"]}]}]} | 2023-12-30T02:46:51+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Azazelle/smol_bruin-7b
Dataset automatically created during the evaluation run of model Azazelle/smol_bruin-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
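For instance (a minimal sketch; the repository id below is assumed from the standard `open-llm-leaderboard/details_<org>__<model>` naming convention used by these evaluation datasets):

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's naming convention.
data = load_dataset("open-llm-leaderboard/details_Azazelle__smol_bruin-7b",
                    "harness_winogrande_5",
                    split="train")
```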
## Latest results
These are the latest results from run 2023-12-30T02:44:17.580970 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Azazelle/smol_bruin-7b\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/smol_bruin-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:44:17.580970(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Azazelle/smol_bruin-7b\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/smol_bruin-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:44:17.580970(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
183,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Azazelle/smol_bruin-7b\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/smol_bruin-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T02:44:17.580970(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
bcd9799e2814c1406cf6f2941194aa3373c9cc76 |
# Dataset Card for Evaluation run of Azazelle/xDAN-SlimOrca
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Azazelle/xDAN-SlimOrca](https://huggingface.co/Azazelle/xDAN-SlimOrca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azazelle__xDAN-SlimOrca",
"harness_winogrande_5",
split="train")
```
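The aggregated metrics live in the "results" configuration mentioned above; a sketch of loading them (assuming the "latest" split naming shown in this repository's config listing):

```python
from datasets import load_dataset

# Aggregated per-task metrics for the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_Azazelle__xDAN-SlimOrca",
                       "results",
                       split="latest")
```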
## Latest results
These are the [latest results from run 2023-12-30T02:47:16.082570](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__xDAN-SlimOrca/blob/main/results_2023-12-30T02-47-16.082570.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6385100804568992,
"acc_stderr": 0.03217117249906078,
"acc_norm": 0.6407471767083691,
"acc_norm_stderr": 0.03280870726203415,
"mc1": 0.408812729498164,
"mc1_stderr": 0.01720995215164173,
"mc2": 0.5768241806655555,
"mc2_stderr": 0.015542058188975288
},
"harness|arc:challenge|25": {
"acc": 0.628839590443686,
"acc_stderr": 0.014117971901142818,
"acc_norm": 0.6561433447098977,
"acc_norm_stderr": 0.013880644570156213
},
"harness|hellaswag|10": {
"acc": 0.6734714200358495,
"acc_stderr": 0.004679847503411344,
"acc_norm": 0.8570005974905397,
"acc_norm_stderr": 0.00349356791409329
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723285,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723285
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.02439667298509476,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.02439667298509476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066482,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.029213549414372174,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.029213549414372174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010358,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010358
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240647,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240647
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601436,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7219730941704036,
"acc_stderr": 0.030069584874494036,
"acc_norm": 0.7219730941704036,
"acc_norm_stderr": 0.030069584874494036
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922526,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468355,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468355
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.01610483388014229,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.01610483388014229
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713002,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49478487614080835,
"acc_stderr": 0.012769541449652547,
"acc_norm": 0.49478487614080835,
"acc_norm_stderr": 0.012769541449652547
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.01922832201869664,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.01922832201869664
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.02768297952296022,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.02768297952296022
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533207,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533207
},
"harness|truthfulqa:mc|0": {
"mc1": 0.408812729498164,
"mc1_stderr": 0.01720995215164173,
"mc2": 0.5768241806655555,
"mc2_stderr": 0.015542058188975288
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.0117056975652052
},
"harness|gsm8k|5": {
"acc": 0.579226686884003,
"acc_stderr": 0.013598489497182838
}
}
```
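A small sketch for inspecting these numbers programmatically, assuming the dictionary above has been saved locally under the file name linked in the previous section:

```python
import json

# File name taken from the "latest results" link above (an assumption: adjust
# the path to wherever you downloaded the results JSON).
with open("results_2023-12-30T02-47-16.082570.json") as f:
    results = json.load(f)

# Rank the per-task accuracies; tasks such as truthfulqa:mc report mc1/mc2
# instead of acc, so they are skipped here.
per_task = {task: scores["acc"]
            for task, scores in results.items()
            if task != "all" and "acc" in scores}
for task, acc in sorted(per_task.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{acc:.3f}  {task}")
```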
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Azazelle__xDAN-SlimOrca | [
"region:us"
] | 2023-12-30T02:49:32+00:00 | {"pretty_name": "Evaluation run of Azazelle/xDAN-SlimOrca", "dataset_summary": "Dataset automatically created during the evaluation run of model [Azazelle/xDAN-SlimOrca](https://huggingface.co/Azazelle/xDAN-SlimOrca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azazelle__xDAN-SlimOrca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T02:47:16.082570](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__xDAN-SlimOrca/blob/main/results_2023-12-30T02-47-16.082570.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6385100804568992,\n \"acc_stderr\": 0.03217117249906078,\n \"acc_norm\": 0.6407471767083691,\n \"acc_norm_stderr\": 0.03280870726203415,\n \"mc1\": 0.408812729498164,\n \"mc1_stderr\": 0.01720995215164173,\n \"mc2\": 0.5768241806655555,\n \"mc2_stderr\": 0.015542058188975288\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.628839590443686,\n \"acc_stderr\": 0.014117971901142818,\n \"acc_norm\": 0.6561433447098977,\n \"acc_norm_stderr\": 0.013880644570156213\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6734714200358495,\n \"acc_stderr\": 0.004679847503411344,\n \"acc_norm\": 0.8570005974905397,\n \"acc_norm_stderr\": 0.00349356791409329\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 
0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723285,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723285\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6358974358974359,\n \"acc_stderr\": 0.02439667298509476,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.02439667298509476\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.029213549414372174,\n \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.029213549414372174\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010358,\n \"acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010358\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240647,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240647\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7219730941704036,\n \"acc_stderr\": 0.030069584874494036,\n \"acc_norm\": 0.7219730941704036,\n \"acc_norm_stderr\": 0.030069584874494036\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n \"acc_stderr\": 0.013853724170922526,\n \"acc_norm\": 
0.8160919540229885,\n \"acc_norm_stderr\": 0.013853724170922526\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468355,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468355\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n \"acc_stderr\": 0.01610483388014229,\n \"acc_norm\": 0.3653631284916201,\n \"acc_norm_stderr\": 0.01610483388014229\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713002,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713002\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49478487614080835,\n \"acc_stderr\": 0.012769541449652547,\n \"acc_norm\": 0.49478487614080835,\n \"acc_norm_stderr\": 0.012769541449652547\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411952,\n \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411952\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.01922832201869664,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.01922832201869664\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.02768297952296022,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.02768297952296022\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533207,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533207\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.408812729498164,\n \"mc1_stderr\": 0.01720995215164173,\n \"mc2\": 0.5768241806655555,\n \"mc2_stderr\": 0.015542058188975288\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.0117056975652052\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.579226686884003,\n \"acc_stderr\": 0.013598489497182838\n }\n}\n```", "repo_url": "https://huggingface.co/Azazelle/xDAN-SlimOrca", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-47-16.082570.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-47-16.082570.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-47-16.082570.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-47-16.082570.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-47-16.082570.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-47-16.082570.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["**/details_harness|winogrande|5_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T02-47-16.082570.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T02_47_16.082570", "path": ["results_2023-12-30T02-47-16.082570.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T02-47-16.082570.parquet"]}]}]} | 2023-12-30T02:49:53+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Azazelle/xDAN-SlimOrca
Dataset automatically created during the evaluation run of model Azazelle/xDAN-SlimOrca on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
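A minimal example using the `datasets` library; the repository and configuration names are taken from this card's metadata (any of the other per-task configurations can be loaded the same way):

```python
from datasets import load_dataset

# Each evaluated task is exposed as its own configuration;
# "harness_winogrande_5" is one of the 63 configurations of this dataset.
data = load_dataset(
    "open-llm-leaderboard/details_Azazelle__xDAN-SlimOrca",
    "harness_winogrande_5",
    split="train",
)
```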
## Latest results
These are the latest results from run 2023-12-30T02:47:16.082570 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Azazelle/xDAN-SlimOrca\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/xDAN-SlimOrca on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:47:16.082570(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Azazelle/xDAN-SlimOrca\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/xDAN-SlimOrca on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:47:16.082570(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
183,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Azazelle/xDAN-SlimOrca\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/xDAN-SlimOrca on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T02:47:16.082570(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
014a560cfe56cb7e9b420fdfc1be145704e029b0 |
# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx2_MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/Mixtral_7Bx2_MoE](https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE",
"harness_winogrande_5",
split="train")
```
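To inspect the other per-task configurations, or the aggregated scores stored in the "results" configuration, a sketch along the same lines (the config and split names follow the file layout declared in this card's metadata; treat the exact names as assumptions if that layout changes):

```python
from datasets import load_dataset, get_dataset_config_names

# One configuration per evaluated task, plus "results" for aggregated metrics.
configs = get_dataset_config_names("open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE")
print(configs)

# The "latest" split always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE",
    "results",
    split="latest",
)
```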
## Latest results
These are the [latest results from run 2023-12-30T02:50:07.869164](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE/blob/main/results_2023-12-30T02-50-07.869164.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6544585464368745,
"acc_stderr": 0.03200088458372204,
"acc_norm": 0.6547015253813414,
"acc_norm_stderr": 0.032655083043548944,
"mc1": 0.5348837209302325,
"mc1_stderr": 0.017460849975873972,
"mc2": 0.6723305286969269,
"mc2_stderr": 0.01523375567555562
},
"harness|arc:challenge|25": {
"acc": 0.6834470989761092,
"acc_stderr": 0.013592431519068079,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266129
},
"harness|hellaswag|10": {
"acc": 0.7013543118900617,
"acc_stderr": 0.004567287775700558,
"acc_norm": 0.8745269866560446,
"acc_norm_stderr": 0.003305774980082251
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.025542846817400496,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.025542846817400496
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.01517314184512625,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.01517314184512625
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.034063153607115086,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.034063153607115086
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4558659217877095,
"acc_stderr": 0.01665722942458631,
"acc_norm": 0.4558659217877095,
"acc_norm_stderr": 0.01665722942458631
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873862,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653354,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653354
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031204,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031204
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5348837209302325,
"mc1_stderr": 0.017460849975873972,
"mc2": 0.6723305286969269,
"mc2_stderr": 0.01523375567555562
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435091
},
"harness|gsm8k|5": {
"acc": 0.6846095526914329,
"acc_stderr": 0.012799353675801832
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE | [
"region:us"
] | 2023-12-30T02:52:24+00:00 | {"pretty_name": "Evaluation run of cloudyu/Mixtral_7Bx2_MoE", "dataset_summary": "Dataset automatically created during the evaluation run of model [cloudyu/Mixtral_7Bx2_MoE](https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T02:50:07.869164](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Mixtral_7Bx2_MoE/blob/main/results_2023-12-30T02-50-07.869164.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6544585464368745,\n \"acc_stderr\": 0.03200088458372204,\n \"acc_norm\": 0.6547015253813414,\n \"acc_norm_stderr\": 0.032655083043548944,\n \"mc1\": 0.5348837209302325,\n \"mc1_stderr\": 0.017460849975873972,\n \"mc2\": 0.6723305286969269,\n \"mc2_stderr\": 0.01523375567555562\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6834470989761092,\n \"acc_stderr\": 0.013592431519068079,\n \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266129\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7013543118900617,\n \"acc_stderr\": 0.004567287775700558,\n \"acc_norm\": 0.8745269866560446,\n \"acc_norm_stderr\": 0.003305774980082251\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n 
\"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.025542846817400496,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.025542846817400496\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8532110091743119,\n \"acc_stderr\": 0.01517314184512625,\n \"acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.01517314184512625\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.034063153607115086,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.034063153607115086\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 
0.8339719029374202,\n \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4558659217877095,\n \"acc_stderr\": 0.01665722942458631,\n \"acc_norm\": 0.4558659217877095,\n \"acc_norm_stderr\": 0.01665722942458631\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653354,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653354\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031204,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031204\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5348837209302325,\n \"mc1_stderr\": 0.017460849975873972,\n \"mc2\": 0.6723305286969269,\n \"mc2_stderr\": 0.01523375567555562\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435091\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6846095526914329,\n \"acc_stderr\": 0.012799353675801832\n }\n}\n```", "repo_url": 
"https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-50-07.869164.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-50-07.869164.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-50-07.869164.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-50-07.869164.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-50-07.869164.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-50-07.869164.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["**/details_harness|winogrande|5_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T02-50-07.869164.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T02_50_07.869164", "path": ["results_2023-12-30T02-50-07.869164.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T02-50-07.869164.parquet"]}]}]} | 2023-12-30T02:52:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx2_MoE
Dataset automatically created during the evaluation run of model cloudyu/Mixtral_7Bx2_MoE on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-30T02:50:07.869164 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx2_MoE\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Mixtral_7Bx2_MoE on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:50:07.869164(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx2_MoE\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Mixtral_7Bx2_MoE on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:50:07.869164(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of cloudyu/Mixtral_7Bx2_MoE\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Mixtral_7Bx2_MoE on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T02:50:07.869164(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
c3e9ed7321c5eb455da06c47b0968f174e03f407 |
# Dataset Card for Evaluation run of Azazelle/Dumb-Maidlet
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Azazelle/Dumb-Maidlet](https://huggingface.co/Azazelle/Dumb-Maidlet) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azazelle__Dumb-Maidlet",
"harness_winogrande_5",
split="train")
```
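
If you only need the aggregated scores rather than the per-example details, the `results` configuration can be loaded the same way (a minimal sketch, assuming the config and split names declared in this card's metadata):

```python
from datasets import load_dataset

# Aggregated metrics of the most recent evaluation run for this model
results = load_dataset(
    "open-llm-leaderboard/details_Azazelle__Dumb-Maidlet",
    "results",
    split="latest",
)
```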
## Latest results
These are the [latest results from run 2023-12-30T02:53:46.543751](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__Dumb-Maidlet/blob/main/results_2023-12-30T02-53-46.543751.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.65359005649605,
"acc_stderr": 0.03188722178891031,
"acc_norm": 0.6555607504847163,
"acc_norm_stderr": 0.032524414463013254,
"mc1": 0.3378212974296206,
"mc1_stderr": 0.01655716732251688,
"mc2": 0.5070013733552167,
"mc2_stderr": 0.015178564699647542
},
"harness|arc:challenge|25": {
"acc": 0.6322525597269625,
"acc_stderr": 0.014090995618168475,
"acc_norm": 0.6680887372013652,
"acc_norm_stderr": 0.013760988200880538
},
"harness|hellaswag|10": {
"acc": 0.6743676558454491,
"acc_stderr": 0.004676529200753,
"acc_norm": 0.8605855407289384,
"acc_norm_stderr": 0.003456706038054756
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.03476599607516478,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.03476599607516478
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944437,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944437
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121434,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121434
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970572,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970572
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.02934457250063434,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.02934457250063434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233497,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233497
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097652,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097652
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.01366423099583483,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.01366423099583483
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.02410571260775431,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.02410571260775431
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.376536312849162,
"acc_stderr": 0.016204672385106606,
"acc_norm": 0.376536312849162,
"acc_norm_stderr": 0.016204672385106606
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504517,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504517
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.027678468642144717,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.027678468642144717
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399683,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399683
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3378212974296206,
"mc1_stderr": 0.01655716732251688,
"mc2": 0.5070013733552167,
"mc2_stderr": 0.015178564699647542
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.011201862744487047
},
"harness|gsm8k|5": {
"acc": 0.6110689916603488,
"acc_stderr": 0.013428382481274237
}
}
```
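
The aggregated numbers above are computed from per-example detail files. To see what those contain, you can inspect the column names and a single row of any loaded split (a sketch building on the loading example above; the exact columns depend on the task):

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Azazelle__Dumb-Maidlet",
    "harness_winogrande_5",
    split="train",
)

# Detail files hold one record per evaluated example; columns vary by task
print(data.column_names)
print(data[0])
```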
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Azazelle__Dumb-Maidlet | [
"region:us"
] | 2023-12-30T02:56:02+00:00 | {"pretty_name": "Evaluation run of Azazelle/Dumb-Maidlet", "dataset_summary": "Dataset automatically created during the evaluation run of model [Azazelle/Dumb-Maidlet](https://huggingface.co/Azazelle/Dumb-Maidlet) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azazelle__Dumb-Maidlet\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T02:53:46.543751](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__Dumb-Maidlet/blob/main/results_2023-12-30T02-53-46.543751.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.65359005649605,\n \"acc_stderr\": 0.03188722178891031,\n \"acc_norm\": 0.6555607504847163,\n \"acc_norm_stderr\": 0.032524414463013254,\n \"mc1\": 0.3378212974296206,\n \"mc1_stderr\": 0.01655716732251688,\n \"mc2\": 0.5070013733552167,\n \"mc2_stderr\": 0.015178564699647542\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6322525597269625,\n \"acc_stderr\": 0.014090995618168475,\n \"acc_norm\": 0.6680887372013652,\n \"acc_norm_stderr\": 0.013760988200880538\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6743676558454491,\n \"acc_stderr\": 0.004676529200753,\n \"acc_norm\": 0.8605855407289384,\n \"acc_norm_stderr\": 0.003456706038054756\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 
0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.03476599607516478,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.03476599607516478\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944437,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944437\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121434,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121434\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970572,\n \"acc_norm\": 
0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970572\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.02934457250063434,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.02934457250063434\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233497,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233497\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097652,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097652\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.01366423099583483,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.01366423099583483\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.02410571260775431,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.02410571260775431\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n \"acc_stderr\": 0.016204672385106606,\n \"acc_norm\": 0.376536312849162,\n \"acc_norm_stderr\": 0.016204672385106606\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n \"acc_stderr\": 0.012731102790504517,\n \"acc_norm\": 0.46088657105606257,\n \"acc_norm_stderr\": 0.012731102790504517\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144717,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144717\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399683,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399683\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3378212974296206,\n \"mc1_stderr\": 0.01655716732251688,\n \"mc2\": 0.5070013733552167,\n \"mc2_stderr\": 0.015178564699647542\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.011201862744487047\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6110689916603488,\n \"acc_stderr\": 0.013428382481274237\n }\n}\n```", "repo_url": "https://huggingface.co/Azazelle/Dumb-Maidlet", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-53-46.543751.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-53-46.543751.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-53-46.543751.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-53-46.543751.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-53-46.543751.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-53-46.543751.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["**/details_harness|winogrande|5_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T02-53-46.543751.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T02_53_46.543751", "path": ["results_2023-12-30T02-53-46.543751.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T02-53-46.543751.parquet"]}]}]} | 2023-12-30T02:56:24+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Azazelle/Dumb-Maidlet
Dataset automatically created during the evaluation run of model Azazelle/Dumb-Maidlet on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-30T02:53:46.543751 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Azazelle/Dumb-Maidlet\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/Dumb-Maidlet on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:53:46.543751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Azazelle/Dumb-Maidlet\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/Dumb-Maidlet on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:53:46.543751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Azazelle/Dumb-Maidlet\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/Dumb-Maidlet on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T02:53:46.543751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
b92292caa65ff596ce8a215161e6f43494bd7c12 |
# Dataset Card for Evaluation run of Azazelle/SlimMelodicMaid
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Azazelle/SlimMelodicMaid](https://huggingface.co/Azazelle/SlimMelodicMaid) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azazelle__SlimMelodicMaid",
"harness_winogrande_5",
split="train")
```
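
The aggregated "results" configuration described above can be loaded in the same way. The snippet below is a minimal sketch that relies only on the configuration and split names documented in this card; inspect the returned columns before depending on any particular field:

```python
from datasets import load_dataset

# The "latest" split of the aggregated "results" configuration always points
# to the most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_Azazelle__SlimMelodicMaid",
    "results",
    split="latest",
)

# Inspect the available columns before relying on a particular field name.
print(results.column_names)
print(results[0])
```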
## Latest results
These are the [latest results from run 2023-12-30T02:54:45.572792](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__SlimMelodicMaid/blob/main/results_2023-12-30T02-54-45.572792.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6495626338556433,
"acc_stderr": 0.03193676074571155,
"acc_norm": 0.6515021650371259,
"acc_norm_stderr": 0.03257111121158258,
"mc1": 0.43084455324357407,
"mc1_stderr": 0.017335272475332363,
"mc2": 0.6087927851947197,
"mc2_stderr": 0.015566919235032412
},
"harness|arc:challenge|25": {
"acc": 0.6424914675767918,
"acc_stderr": 0.014005494275916576,
"acc_norm": 0.6715017064846417,
"acc_norm_stderr": 0.013724978465537302
},
"harness|hellaswag|10": {
"acc": 0.6796454889464251,
"acc_stderr": 0.0046565916786067574,
"acc_norm": 0.8600876319458275,
"acc_norm_stderr": 0.0034618713240671954
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339526,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339526
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02390115797940253,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02390115797940253
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7310924369747899,
"acc_stderr": 0.02880139219363127,
"acc_norm": 0.7310924369747899,
"acc_norm_stderr": 0.02880139219363127
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7354260089686099,
"acc_stderr": 0.029605103217038325,
"acc_norm": 0.7354260089686099,
"acc_norm_stderr": 0.029605103217038325
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097654,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097654
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834829,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834829
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3743016759776536,
"acc_stderr": 0.01618544417945717,
"acc_norm": 0.3743016759776536,
"acc_norm_stderr": 0.01618544417945717
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4869621903520209,
"acc_stderr": 0.012765893883835332,
"acc_norm": 0.4869621903520209,
"acc_norm_stderr": 0.012765893883835332
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.027576468622740543,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.027576468622740543
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.0193533605475537,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.0193533605475537
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178816,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178816
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43084455324357407,
"mc1_stderr": 0.017335272475332363,
"mc2": 0.6087927851947197,
"mc2_stderr": 0.015566919235032412
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090254
},
"harness|gsm8k|5": {
"acc": 0.6080363912054587,
"acc_stderr": 0.013447140886023815
}
}
```
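
As a small illustration of how the per-task metrics above can be post-processed, the following sketch computes a simple macro average of accuracy over the MMLU (hendrycksTest) subtasks. It assumes the JSON block above has been saved to a local file named `latest_results.json` (a hypothetical filename used only for this example):

```python
import json

# Hypothetical filename: the JSON results shown above, saved locally.
with open("latest_results.json") as f:
    latest_results = json.load(f)

# Keep only the MMLU (hendrycksTest) subtasks; this skips the "all" entry,
# ARC, HellaSwag, TruthfulQA, Winogrande and GSM8K.
mmlu_tasks = {
    name: metrics
    for name, metrics in latest_results.items()
    if name.startswith("harness|hendrycksTest-")
}

# Simple macro average of accuracy across the subtasks.
mmlu_acc = sum(m["acc"] for m in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"{len(mmlu_tasks)} MMLU subtasks, mean acc = {mmlu_acc:.4f}")
```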
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Azazelle__SlimMelodicMaid | [
"region:us"
] | 2023-12-30T02:57:02+00:00 | {"pretty_name": "Evaluation run of Azazelle/SlimMelodicMaid", "dataset_summary": "Dataset automatically created during the evaluation run of model [Azazelle/SlimMelodicMaid](https://huggingface.co/Azazelle/SlimMelodicMaid) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azazelle__SlimMelodicMaid\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T02:54:45.572792](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__SlimMelodicMaid/blob/main/results_2023-12-30T02-54-45.572792.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6495626338556433,\n \"acc_stderr\": 0.03193676074571155,\n \"acc_norm\": 0.6515021650371259,\n \"acc_norm_stderr\": 0.03257111121158258,\n \"mc1\": 0.43084455324357407,\n \"mc1_stderr\": 0.017335272475332363,\n \"mc2\": 0.6087927851947197,\n \"mc2_stderr\": 0.015566919235032412\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6424914675767918,\n \"acc_stderr\": 0.014005494275916576,\n \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.013724978465537302\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6796454889464251,\n \"acc_stderr\": 0.0046565916786067574,\n \"acc_norm\": 0.8600876319458275,\n \"acc_norm_stderr\": 0.0034618713240671954\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n 
\"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339526,\n \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339526\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6666666666666666,\n \"acc_stderr\": 0.02390115797940253,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02390115797940253\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7310924369747899,\n \"acc_stderr\": 0.02880139219363127,\n \"acc_norm\": 0.7310924369747899,\n \"acc_norm_stderr\": 0.02880139219363127\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7354260089686099,\n \"acc_stderr\": 0.029605103217038325,\n \"acc_norm\": 0.7354260089686099,\n \"acc_norm_stderr\": 0.029605103217038325\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097654,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097654\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.013664230995834829,\n \"acc_norm\": 0.822477650063857,\n 
\"acc_norm_stderr\": 0.013664230995834829\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3743016759776536,\n \"acc_stderr\": 0.01618544417945717,\n \"acc_norm\": 0.3743016759776536,\n \"acc_norm_stderr\": 0.01618544417945717\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4869621903520209,\n \"acc_stderr\": 0.012765893883835332,\n \"acc_norm\": 0.4869621903520209,\n \"acc_norm_stderr\": 0.012765893883835332\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.027576468622740543,\n \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.027576468622740543\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6454248366013072,\n \"acc_stderr\": 0.0193533605475537,\n \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.0193533605475537\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43084455324357407,\n \"mc1_stderr\": 0.017335272475332363,\n \"mc2\": 0.6087927851947197,\n \"mc2_stderr\": 0.015566919235032412\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090254\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6080363912054587,\n \"acc_stderr\": 0.013447140886023815\n }\n}\n```", "repo_url": "https://huggingface.co/Azazelle/SlimMelodicMaid", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-54-45.572792.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-54-45.572792.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-54-45.572792.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T02-54-45.572792.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-54-45.572792.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-54-45.572792.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["**/details_harness|winogrande|5_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T02-54-45.572792.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T02_54_45.572792", "path": ["results_2023-12-30T02-54-45.572792.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T02-54-45.572792.parquet"]}]}]} | 2023-12-30T02:57:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Azazelle/SlimMelodicMaid
Dataset automatically created during the evaluation run of model Azazelle/SlimMelodicMaid on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
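A minimal loading sketch, assuming the repository id follows the leaderboard's usual `details_<org>__<model>` naming (i.e. `open-llm-leaderboard/details_Azazelle__SlimMelodicMaid`) and using the `harness_winogrande_5` configuration listed in this card's metadata:

```python
from datasets import load_dataset

# Per-sample details for the 5-shot Winogrande task of this evaluation run.
# The repo id is inferred from the leaderboard's details_<org>__<model> convention.
data = load_dataset("open-llm-leaderboard/details_Azazelle__SlimMelodicMaid",
    "harness_winogrande_5",
    split="train")
```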
## Latest results
These are the latest results from run 2023-12-30T02:54:45.572792 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Azazelle/SlimMelodicMaid\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/SlimMelodicMaid on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:54:45.572792(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Azazelle/SlimMelodicMaid\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/SlimMelodicMaid on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T02:54:45.572792(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
183,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Azazelle/SlimMelodicMaid\n\n\n\nDataset automatically created during the evaluation run of model Azazelle/SlimMelodicMaid on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T02:54:45.572792(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
17128e3355771abe410ccf9db30689fc87de21a0 | Since The Pile was removed from the original site, I'm worried this dataset might be taken down too. Putting it here just in case.
Original repo: https://huggingface.co/datasets/EleutherAI/the_pile_deduplicated | gmongaras/EleutherAI_the_pile_deduplicated | [
"region:us"
] | 2023-12-30T03:06:00+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 824546807506, "num_examples": 134318121}], "download_size": 451848716133, "dataset_size": 824546807506}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-31T00:51:35+00:00 | [] | [] | TAGS
#region-us
| Since The Pile was removed from the original site, I'm worried this dataset might be taken down too. Putting it here just in case.
Original repo: URL | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
ac535d1bb79d4e642f5760ac9596773dc154cb77 |
---
license: cc-by-4.0
---
| shangrilar/ko_text2sql | [
"region:us"
] | 2023-12-30T03:27:12+00:00 | {"configs": [{"config_name": "origin", "data_files": [{"split": "train", "path": "origin/train.csv"}, {"split": "test", "path": "test.csv"}]}, {"config_name": "clean", "data_files": [{"split": "train", "path": "clean/train.csv"}, {"split": "test", "path": "test.csv"}]}]} | 2024-01-30T02:20:07+00:00 | [] | [] | TAGS
#region-us
|
---
license: cc-by-4.0
---
| [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
5f193d96e5933c80ae3fd0c1c1fad4466cffa49d | # Dataset Card for "ffmperative_sample_10k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | salma-remyx/ffmperative_sample_10k | [
"region:us"
] | 2023-12-30T03:29:45+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4049349, "num_examples": 10000}], "download_size": 1276340, "dataset_size": 4049349}} | 2023-12-30T03:29:49+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ffmperative_sample_10k"
More Information needed | [
"# Dataset Card for \"ffmperative_sample_10k\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ffmperative_sample_10k\"\n\nMore Information needed"
] | [
6,
19
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"ffmperative_sample_10k\"\n\nMore Information needed"
] |
7c5e76c3bdf0f2b71389d3c341209bf61c8167df |
# Dataset Card for Evaluation run of NeverSleep/Noromaid-7b-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NeverSleep/Noromaid-7b-v0.2](https://huggingface.co/NeverSleep/Noromaid-7b-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NeverSleep__Noromaid-7b-v0.2",
"harness_winogrande_5",
split="train")
```
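To retrieve only the aggregated metrics rather than the per-sample details, the "results" configuration mentioned above can be loaded in the same way; a minimal sketch, assuming the "results" config and its "latest" split follow the same layout as the other evaluation-run cards in this collection:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model.
# Assumes the standard "results" config with a "latest" split.
results = load_dataset("open-llm-leaderboard/details_NeverSleep__Noromaid-7b-v0.2",
    "results",
    split="latest")
print(results[0])
```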
## Latest results
These are the [latest results from run 2023-12-30T03:29:29.749943](https://huggingface.co/datasets/open-llm-leaderboard/details_NeverSleep__Noromaid-7b-v0.2/blob/main/results_2023-12-30T03-29-29.749943.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6287666667526353,
"acc_stderr": 0.03244295900262462,
"acc_norm": 0.6345459757350616,
"acc_norm_stderr": 0.03309733796081751,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4609267934370558,
"mc2_stderr": 0.01459133625745078
},
"harness|arc:challenge|25": {
"acc": 0.5861774744027304,
"acc_stderr": 0.014392730009221005,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000326
},
"harness|hellaswag|10": {
"acc": 0.6516630153355906,
"acc_stderr": 0.004754697013354959,
"acc_norm": 0.8492332204740092,
"acc_norm_stderr": 0.0035709011883580687
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6528301886792452,
"acc_stderr": 0.029300101705549652,
"acc_norm": 0.6528301886792452,
"acc_norm_stderr": 0.029300101705549652
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.0291265228345868,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.0291265228345868
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.031204691225150016,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.031204691225150016
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848057,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848057
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02886743144984932,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02886743144984932
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134987,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134987
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597524,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597524
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7969348659003831,
"acc_stderr": 0.014385525076611573,
"acc_norm": 0.7969348659003831,
"acc_norm_stderr": 0.014385525076611573
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761976,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761976
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818774,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818774
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.01268781841959992,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.01268781841959992
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824866,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824866
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.01918463932809249,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.01918463932809249
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4609267934370558,
"mc2_stderr": 0.01459133625745078
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
},
"harness|gsm8k|5": {
"acc": 0.36694465504169826,
"acc_stderr": 0.013275883047712211
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NeverSleep__Noromaid-7b-v0.2 | [
"region:us"
] | 2023-12-30T03:30:28+00:00 | {"pretty_name": "Evaluation run of NeverSleep/Noromaid-7b-v0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [NeverSleep/Noromaid-7b-v0.2](https://huggingface.co/NeverSleep/Noromaid-7b-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeverSleep__Noromaid-7b-v0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T03:29:29.749943](https://huggingface.co/datasets/open-llm-leaderboard/details_NeverSleep__Noromaid-7b-v0.2/blob/main/results_2023-12-30T03-29-29.749943.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6287666667526353,\n \"acc_stderr\": 0.03244295900262462,\n \"acc_norm\": 0.6345459757350616,\n \"acc_norm_stderr\": 0.03309733796081751,\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4609267934370558,\n \"mc2_stderr\": 0.01459133625745078\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5861774744027304,\n \"acc_stderr\": 0.014392730009221005,\n \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000326\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6516630153355906,\n \"acc_stderr\": 0.004754697013354959,\n \"acc_norm\": 0.8492332204740092,\n \"acc_norm_stderr\": 0.0035709011883580687\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6528301886792452,\n \"acc_stderr\": 0.029300101705549652,\n \"acc_norm\": 0.6528301886792452,\n \"acc_norm_stderr\": 0.029300101705549652\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.0291265228345868,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.0291265228345868\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.031204691225150016,\n \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.031204691225150016\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848057,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848057\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.02886743144984932,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02886743144984932\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597524,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597524\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7969348659003831,\n \"acc_stderr\": 0.014385525076611573,\n \"acc_norm\": 0.7969348659003831,\n \"acc_norm_stderr\": 0.014385525076611573\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n \"acc_stderr\": 0.014736926383761976,\n \"acc_norm\": 0.2636871508379888,\n \"acc_norm_stderr\": 0.014736926383761976\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818774,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818774\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n \"acc_stderr\": 0.01268781841959992,\n \"acc_norm\": 0.44328552803129073,\n \"acc_norm_stderr\": 0.01268781841959992\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824866,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824866\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6584967320261438,\n \"acc_stderr\": 0.01918463932809249,\n \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.01918463932809249\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4609267934370558,\n \"mc2_stderr\": 0.01459133625745078\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36694465504169826,\n \"acc_stderr\": 
0.013275883047712211\n }\n}\n```", "repo_url": "https://huggingface.co/NeverSleep/Noromaid-7b-v0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|arc:challenge|25_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|arc:challenge|25_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|gsm8k|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|gsm8k|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hellaswag|10_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hellaswag|10_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T03-28-10.331796.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T03-28-10.331796.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T03-29-29.749943.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T03-29-29.749943.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T03-29-29.749943.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T03-29-29.749943.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T03-28-10.331796.parquet"]}, 
{"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["**/details_harness|winogrande|5_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": ["**/details_harness|winogrande|5_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T03-29-29.749943.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T03_28_10.331796", "path": ["results_2023-12-30T03-28-10.331796.parquet"]}, {"split": "2023_12_30T03_29_29.749943", "path": 
["results_2023-12-30T03-29-29.749943.parquet"]}, {"split": "latest", "path": ["results_2023-12-30T03-29-29.749943.parquet"]}]}]} | 2023-12-30T03:32:07+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of NeverSleep/Noromaid-7b-v0.2
Dataset automatically created during the evaluation run of model NeverSleep/Noromaid-7b-v0.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
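For example, here is a minimal sketch using the `datasets` library. The repository path below is an assumption based on the leaderboard's usual `details_<org>__<model>` naming convention, and the config and split names are taken from this dataset's configuration; adjust them if they differ.

```python
from datasets import load_dataset

# Repository path assumed from the leaderboard's "details_<org>__<model>" convention.
data = load_dataset(
    "open-llm-leaderboard/details_NeverSleep__Noromaid-7b-v0.2",
    "harness_winogrande_5",  # any of the 63 task configurations can be used here
    split="latest",          # timestamped splits such as "2023_12_30T03_29_29.749943" hold individual runs
)
print(data)
```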
## Latest results
These are the latest results from run 2023-12-30T03:29:29.749943 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of NeverSleep/Noromaid-7b-v0.2\n\n\n\nDataset automatically created during the evaluation run of model NeverSleep/Noromaid-7b-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T03:29:29.749943(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NeverSleep/Noromaid-7b-v0.2\n\n\n\nDataset automatically created during the evaluation run of model NeverSleep/Noromaid-7b-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T03:29:29.749943(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of NeverSleep/Noromaid-7b-v0.2\n\n\n\nDataset automatically created during the evaluation run of model NeverSleep/Noromaid-7b-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T03:29:29.749943(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
4d7f1fc953a8aa434319d2a5a6a3326130ed58b9 |
# CoTAK Dataset: **Co**mmonsense **T**emporal **A**ction **K**nowledge
A dataset resource consisting of short action-describing sentences annotated with temporal commonsense knowledge. The dataset consists of instructions extracted from WikiHow, which are annotated with commonsense knowledge-based temporal labels indicating implicitly understood information about the actions described by the sentences, including approximately how long an action takes to perform and approximately how long its effects last. For short-duration actions labeled as taking seconds or minutes, which are of particular relevance to automated task planning, e.g. in robotics applications, the dataset also provides scalar values specifying how long the actions take to perform.
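As a minimal sketch, the published splits can be loaded with the `datasets` library; the `coarse_grained` configuration and the `len5` through `len10` split names are taken from this repository's metadata, while the record fields themselves are not documented in this card, so inspect an example first:

```python
from datasets import load_dataset

# Load the coarse-grained annotations from one split.
# Configuration and split names (len5 ... len10) come from this repository's
# metadata; they presumably group instructions by sentence length.
cotak = load_dataset("kamelliao/CoTAK", "coarse_grained", split="len5")

# The field names are not documented in this card, so inspect a record first.
print(cotak[0])
```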
## Acknowledgment
The content of this repository is based on the work from the [nsjl/aakg-data](https://github.com/nsjl/aakg-data) repository. All credit for the original dataset and its content goes to the contributors of the original repository.
### Original Repository Information
- Original Repository: [https://github.com/nsjl/aakg-data](https://github.com/nsjl/aakg-data)
- Original Maintainer: [nsjl](https://github.com/nsjl)
## License
Their work is partially derived from information extracted from WikiHow, which is used under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 license, and this dataset is also distributed under the same license. | kamelliao/CoTAK | [
"license:cc-by-nc-sa-3.0",
"region:us"
] | 2023-12-30T03:47:15+00:00 | {"license": "cc-by-nc-sa-3.0", "configs": [{"config_name": "coarse_grained", "data_files": [{"split": "len5", "path": "data/coarsegrained/json/len5.json"}, {"split": "len6", "path": "data/coarsegrained/json/len6.json"}, {"split": "len7", "path": "data/coarsegrained/json/len7.json"}, {"split": "len8", "path": "data/coarsegrained/json/len8.json"}, {"split": "len9", "path": "data/coarsegrained/json/len9.json"}, {"split": "len10", "path": "data/coarsegrained/json/len10.json"}]}]} | 2023-12-30T03:53:40+00:00 | [] | [] | TAGS
#license-cc-by-nc-sa-3.0 #region-us
|
# CoTAK Dataset: Commonsense *T*emporal *A*ction *K*nowledge
A dataset resource consisting of short descriptions of action-describing sentences annotated with temporal commonsense knowledge. The dataset consists of instructions extracted from WikiHow, which are annotated with commonsense knowledge-based temporal labels indicating implicitly understood information about the actions described by the sentences, including approximately how long an action takes to perform and approximately how long its effects last for. For short duration actions labeled as taking seconds or minutes, which would be of relevance to automated task planning, e.g. in robotics applications, the dataset also provides scalar values to label the temporal durations of how long actions take to perform.
## Acknowledgment
The content of this repository is based on the work from the nsjl/aakg-data repository. All credit for the original dataset and its content goes to the contributors of the original repository.
### Original Repository Information
- Original Repository: URL
- Original Maintainer: nsjl
## License
Their work is partially derived from information extracted from WikiHow, which is used under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 license and this dataset is also distributed under the same license. | [
"# CoTAK Dataset: Commonsense *T*emporal *A*ction *K*nowledge\n\nA dataset resource consisting of short descriptions of action-describing sentences annotated with temporal commonsense knowledge. The dataset consists of instructions extracted from WikiHow, which are annotated with commonsense knowledge-based temporal labels indicating implicitly understood information about the actions described by the sentences, including approximately how long an action takes to perform and approximately how long its effects last for. For short duration actions labeled as taking seconds or minutes, which would be of relevance to automated task planning, e.g. in robotics applications, the dataset also provides scalar values to label the temporal durations of how long actions take to perform.",
"## Acknowledgment\n\nThe content of this repository is based on the work from the nsjl/aakg-data repository. All credit for the original dataset and its content goes to the contributors of the original repository.",
"### Original Repository Information\n\n- Original Repository: URL\n- Original Maintainer: nsjl",
"## License\n\nTheir work is partially derived from information extracted from WikiHow, which is used under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 license and this dataset is also distributed under the same license."
] | [
"TAGS\n#license-cc-by-nc-sa-3.0 #region-us \n",
"# CoTAK Dataset: Commonsense *T*emporal *A*ction *K*nowledge\n\nA dataset resource consisting of short descriptions of action-describing sentences annotated with temporal commonsense knowledge. The dataset consists of instructions extracted from WikiHow, which are annotated with commonsense knowledge-based temporal labels indicating implicitly understood information about the actions described by the sentences, including approximately how long an action takes to perform and approximately how long its effects last for. For short duration actions labeled as taking seconds or minutes, which would be of relevance to automated task planning, e.g. in robotics applications, the dataset also provides scalar values to label the temporal durations of how long actions take to perform.",
"## Acknowledgment\n\nThe content of this repository is based on the work from the nsjl/aakg-data repository. All credit for the original dataset and its content goes to the contributors of the original repository.",
"### Original Repository Information\n\n- Original Repository: URL\n- Original Maintainer: nsjl",
"## License\n\nTheir work is partially derived from information extracted from WikiHow, which is used under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 license and this dataset is also distributed under the same license."
] | [
19,
165,
55,
23,
51
] | [
"passage: TAGS\n#license-cc-by-nc-sa-3.0 #region-us \n# CoTAK Dataset: Commonsense *T*emporal *A*ction *K*nowledge\n\nA dataset resource consisting of short descriptions of action-describing sentences annotated with temporal commonsense knowledge. The dataset consists of instructions extracted from WikiHow, which are annotated with commonsense knowledge-based temporal labels indicating implicitly understood information about the actions described by the sentences, including approximately how long an action takes to perform and approximately how long its effects last for. For short duration actions labeled as taking seconds or minutes, which would be of relevance to automated task planning, e.g. in robotics applications, the dataset also provides scalar values to label the temporal durations of how long actions take to perform.## Acknowledgment\n\nThe content of this repository is based on the work from the nsjl/aakg-data repository. All credit for the original dataset and its content goes to the contributors of the original repository.### Original Repository Information\n\n- Original Repository: URL\n- Original Maintainer: nsjl## License\n\nTheir work is partially derived from information extracted from WikiHow, which is used under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 license and this dataset is also distributed under the same license."
] |
c11868362d3727a962fc68268fa47a408ed521e5 |
# Dataset Card for Evaluation run of dfurman/Mixtral-8x7B-peft-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [dfurman/Mixtral-8x7B-peft-v0.1](https://huggingface.co/dfurman/Mixtral-8x7B-peft-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dfurman__Mixtral-8x7B-peft-v0.1",
"harness_winogrande_5",
split="train")
```
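To enumerate the available configurations programmatically (one per evaluated task, plus the aggregated "results" configuration), a minimal sketch using the standard `datasets` helper is:

```python
from datasets import get_dataset_config_names

# List the configurations of this evaluation run: one per evaluated task
# (e.g. "harness_winogrande_5") plus the aggregated "results" configuration.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_dfurman__Mixtral-8x7B-peft-v0.1"
)
print(len(configs), "configurations")
print(configs[:5])
```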
## Latest results
These are the [latest results from run 2023-12-30T03:53:03.257833](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__Mixtral-8x7B-peft-v0.1/blob/main/results_2023-12-30T03-53-03.257833.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6838833486798317,
"acc_stderr": 0.031090263024396967,
"acc_norm": 0.6886589872846014,
"acc_norm_stderr": 0.031695084829465675,
"mc1": 0.4320685434516524,
"mc1_stderr": 0.01734120239498826,
"mc2": 0.5954274012815367,
"mc2_stderr": 0.015021172396422498
},
"harness|arc:challenge|25": {
"acc": 0.6399317406143344,
"acc_stderr": 0.014027516814585186,
"acc_norm": 0.6723549488054608,
"acc_norm_stderr": 0.013715847940719339
},
"harness|hellaswag|10": {
"acc": 0.660426209918343,
"acc_stderr": 0.004725967684806407,
"acc_norm": 0.8602867954590719,
"acc_norm_stderr": 0.0034598069913898367
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7509433962264151,
"acc_stderr": 0.026616482980501704,
"acc_norm": 0.7509433962264151,
"acc_norm_stderr": 0.026616482980501704
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7986111111111112,
"acc_stderr": 0.03353647469713839,
"acc_norm": 0.7986111111111112,
"acc_norm_stderr": 0.03353647469713839
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6595744680851063,
"acc_stderr": 0.030976692998534436,
"acc_norm": 0.6595744680851063,
"acc_norm_stderr": 0.030976692998534436
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.046774730044912005,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.046774730044912005
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6758620689655173,
"acc_stderr": 0.03900432069185555,
"acc_norm": 0.6758620689655173,
"acc_norm_stderr": 0.03900432069185555
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4973544973544973,
"acc_stderr": 0.025750949678130387,
"acc_norm": 0.4973544973544973,
"acc_norm_stderr": 0.025750949678130387
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.02218571009225225,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.02218571009225225
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5763546798029556,
"acc_stderr": 0.03476725747649038,
"acc_norm": 0.5763546798029556,
"acc_norm_stderr": 0.03476725747649038
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.030117688929503575,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.030117688929503575
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8282828282828283,
"acc_stderr": 0.02686971618742991,
"acc_norm": 0.8282828282828283,
"acc_norm_stderr": 0.02686971618742991
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240524,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240524
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.0234009289183105,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.0234009289183105
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3962962962962963,
"acc_stderr": 0.029822619458533997,
"acc_norm": 0.3962962962962963,
"acc_norm_stderr": 0.029822619458533997
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.02947248583313608,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.02947248583313608
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8605504587155963,
"acc_stderr": 0.014852421490033055,
"acc_norm": 0.8605504587155963,
"acc_norm_stderr": 0.014852421490033055
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455333,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455333
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065498,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065498
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476074,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476074
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219587,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219587
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990936,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990936
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8684546615581098,
"acc_stderr": 0.012086705214250428,
"acc_norm": 0.8684546615581098,
"acc_norm_stderr": 0.012086705214250428
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7976878612716763,
"acc_stderr": 0.021628077380196127,
"acc_norm": 0.7976878612716763,
"acc_norm_stderr": 0.021628077380196127
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35977653631284917,
"acc_stderr": 0.016051419760310267,
"acc_norm": 0.35977653631284917,
"acc_norm_stderr": 0.016051419760310267
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7781350482315113,
"acc_stderr": 0.02359885829286305,
"acc_norm": 0.7781350482315113,
"acc_norm_stderr": 0.02359885829286305
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8179012345679012,
"acc_stderr": 0.021473491834808355,
"acc_norm": 0.8179012345679012,
"acc_norm_stderr": 0.021473491834808355
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.524822695035461,
"acc_stderr": 0.0297907192438297,
"acc_norm": 0.524822695035461,
"acc_norm_stderr": 0.0297907192438297
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5482398956975228,
"acc_stderr": 0.012710662233660247,
"acc_norm": 0.5482398956975228,
"acc_norm_stderr": 0.012710662233660247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.027678468642144707,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.027678468642144707
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.01815287105153881,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.01815287105153881
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.0389136449583582,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.0389136449583582
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4320685434516524,
"mc1_stderr": 0.01734120239498826,
"mc2": 0.5954274012815367,
"mc2_stderr": 0.015021172396422498
},
"harness|winogrande|5": {
"acc": 0.8042620363062352,
"acc_stderr": 0.011151145042218319
},
"harness|gsm8k|5": {
"acc": 0.5140257771038665,
"acc_stderr": 0.013767064940239285
}
}
```
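As a rough sketch (not part of the generated card), the six headline scores that the Open LLM Leaderboard typically averages can be pulled out of the dictionary above; note that using the "all" accuracy as the MMLU-style aggregate is an approximation:

```python
# Approximate headline average from the aggregated results above.
# Assumption: results["all"]["acc"] is used as a stand-in for the MMLU aggregate.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6723549488054608},
    "harness|hellaswag|10": {"acc_norm": 0.8602867954590719},
    "all": {"acc": 0.6838833486798317},
    "harness|truthfulqa:mc|0": {"mc2": 0.5954274012815367},
    "harness|winogrande|5": {"acc": 0.8042620363062352},
    "harness|gsm8k|5": {"acc": 0.5140257771038665},
}

headline = [
    results["harness|arc:challenge|25"]["acc_norm"],
    results["harness|hellaswag|10"]["acc_norm"],
    results["all"]["acc"],
    results["harness|truthfulqa:mc|0"]["mc2"],
    results["harness|winogrande|5"]["acc"],
    results["harness|gsm8k|5"]["acc"],
]
print(f"approximate leaderboard average: {sum(headline) / len(headline):.4f}")
```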
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_dfurman__Mixtral-8x7B-peft-v0.1 | [
"region:us"
] | 2023-12-30T03:55:24+00:00 | {"pretty_name": "Evaluation run of dfurman/Mixtral-8x7B-peft-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [dfurman/Mixtral-8x7B-peft-v0.1](https://huggingface.co/dfurman/Mixtral-8x7B-peft-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dfurman__Mixtral-8x7B-peft-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T03:53:03.257833](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__Mixtral-8x7B-peft-v0.1/blob/main/results_2023-12-30T03-53-03.257833.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6838833486798317,\n \"acc_stderr\": 0.031090263024396967,\n \"acc_norm\": 0.6886589872846014,\n \"acc_norm_stderr\": 0.031695084829465675,\n \"mc1\": 0.4320685434516524,\n \"mc1_stderr\": 0.01734120239498826,\n \"mc2\": 0.5954274012815367,\n \"mc2_stderr\": 0.015021172396422498\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6399317406143344,\n \"acc_stderr\": 0.014027516814585186,\n \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719339\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.660426209918343,\n \"acc_stderr\": 0.004725967684806407,\n \"acc_norm\": 0.8602867954590719,\n \"acc_norm_stderr\": 0.0034598069913898367\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7509433962264151,\n \"acc_stderr\": 0.026616482980501704,\n \"acc_norm\": 0.7509433962264151,\n \"acc_norm_stderr\": 0.026616482980501704\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n \"acc_stderr\": 0.03353647469713839,\n \"acc_norm\": 0.7986111111111112,\n \"acc_norm_stderr\": 0.03353647469713839\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n 
\"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6595744680851063,\n \"acc_stderr\": 0.030976692998534436,\n \"acc_norm\": 0.6595744680851063,\n \"acc_norm_stderr\": 0.030976692998534436\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.046774730044912005,\n \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.046774730044912005\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6758620689655173,\n \"acc_stderr\": 0.03900432069185555,\n \"acc_norm\": 0.6758620689655173,\n \"acc_norm_stderr\": 0.03900432069185555\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4973544973544973,\n \"acc_stderr\": 0.025750949678130387,\n \"acc_norm\": 0.4973544973544973,\n \"acc_norm_stderr\": 0.025750949678130387\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n \"acc_stderr\": 0.02218571009225225,\n \"acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.02218571009225225\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5763546798029556,\n \"acc_stderr\": 0.03476725747649038,\n \"acc_norm\": 0.5763546798029556,\n \"acc_norm_stderr\": 0.03476725747649038\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.030117688929503575,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.030117688929503575\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8282828282828283,\n \"acc_stderr\": 0.02686971618742991,\n \"acc_norm\": 0.8282828282828283,\n \"acc_norm_stderr\": 0.02686971618742991\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240524,\n \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240524\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.0234009289183105,\n \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.0234009289183105\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3962962962962963,\n \"acc_stderr\": 0.029822619458533997,\n \"acc_norm\": 0.3962962962962963,\n \"acc_norm_stderr\": 0.029822619458533997\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.02947248583313608,\n \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.02947248583313608\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8605504587155963,\n \"acc_stderr\": 0.014852421490033055,\n \"acc_norm\": 0.8605504587155963,\n \"acc_norm_stderr\": 0.014852421490033055\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455333,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455333\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065498,\n \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065498\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476074,\n \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476074\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219587,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219587\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.020237149008990936,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.020237149008990936\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8684546615581098,\n \"acc_stderr\": 0.012086705214250428,\n \"acc_norm\": 0.8684546615581098,\n \"acc_norm_stderr\": 0.012086705214250428\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7976878612716763,\n \"acc_stderr\": 0.021628077380196127,\n \"acc_norm\": 0.7976878612716763,\n \"acc_norm_stderr\": 0.021628077380196127\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35977653631284917,\n \"acc_stderr\": 0.016051419760310267,\n \"acc_norm\": 0.35977653631284917,\n \"acc_norm_stderr\": 0.016051419760310267\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7781350482315113,\n \"acc_stderr\": 0.02359885829286305,\n \"acc_norm\": 0.7781350482315113,\n \"acc_norm_stderr\": 0.02359885829286305\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8179012345679012,\n \"acc_stderr\": 0.021473491834808355,\n \"acc_norm\": 0.8179012345679012,\n \"acc_norm_stderr\": 0.021473491834808355\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.524822695035461,\n \"acc_stderr\": 0.0297907192438297,\n \"acc_norm\": 0.524822695035461,\n \"acc_norm_stderr\": 0.0297907192438297\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5482398956975228,\n \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.5482398956975228,\n \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144707,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144707\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.01815287105153881,\n \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.01815287105153881\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.0389136449583582,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.0389136449583582\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4320685434516524,\n \"mc1_stderr\": 0.01734120239498826,\n \"mc2\": 0.5954274012815367,\n \"mc2_stderr\": 0.015021172396422498\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8042620363062352,\n \"acc_stderr\": 0.011151145042218319\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5140257771038665,\n \"acc_stderr\": 0.013767064940239285\n 
}\n}\n```", "repo_url": "https://huggingface.co/dfurman/Mixtral-8x7B-peft-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|arc:challenge|25_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|gsm8k|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hellaswag|10_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T03-53-03.257833.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T03-53-03.257833.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T03-53-03.257833.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T03-53-03.257833.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T03-53-03.257833.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T03_53_03.257833", "path": ["**/details_harness|winogrande|5_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T03-53-03.257833.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_30T03_53_03.257833", "path": ["results_2023-12-30T03-53-03.257833.parquet"]}, {"split": "latest", "path": ["results_2023-12-30T03-53-03.257833.parquet"]}]}]} | 2023-12-30T03:55:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of dfurman/Mixtral-8x7B-peft-v0.1
Dataset automatically created during the evaluation run of model dfurman/Mixtral-8x7B-peft-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
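A minimal sketch of that call is given below; the repository id `open-llm-leaderboard/details_dfurman__Mixtral-8x7B-peft-v0.1` and the `harness_winogrande_5` configuration are inferred from the naming pattern used by the other evaluation-run datasets in this collection, so treat them as assumptions rather than confirmed values.

```python
from datasets import load_dataset

# Repository id assumed from the usual details_<org>__<model> naming pattern
data = load_dataset(
    "open-llm-leaderboard/details_dfurman__Mixtral-8x7B-peft-v0.1",
    "harness_winogrande_5",  # one of the 63 task configurations
    split="train",
)
```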
## Latest results
These are the latest results from run 2023-12-30T03:53:03.257833 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of dfurman/Mixtral-8x7B-peft-v0.1\n\n\n\nDataset automatically created during the evaluation run of model dfurman/Mixtral-8x7B-peft-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T03:53:03.257833(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of dfurman/Mixtral-8x7B-peft-v0.1\n\n\n\nDataset automatically created during the evaluation run of model dfurman/Mixtral-8x7B-peft-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T03:53:03.257833(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dfurman/Mixtral-8x7B-peft-v0.1\n\n\n\nDataset automatically created during the evaluation run of model dfurman/Mixtral-8x7B-peft-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T03:53:03.257833(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
9cae03c8e1667fea3ea2a5404bf2cc9b0edd2fb0 | # Dataset Card for Bird_audio_in_China
> **This dataset collects the calls of `401` bird species found within China; the author used it for a simple `TinyML` project: "Real-time bird species recognition based on the STM32F746".**
>
> Bird call classification is a common environmental sound classification task and a good fit for exploring embedded AI applications; it also has real practical value for ecological research, bird conservation, and biodiversity monitoring. By compressing bird call classification algorithms and models onto small devices, these capabilities can be brought to more scenarios and applications, such as smart nest-box monitoring systems and drone patrol monitoring systems, which can be used to assess ecosystem health and monitor climate change, as well as to monitor and study bird distribution, migration routes, and habitat use.
>
> **Disclaimer: the audio comes from https://xeno-canto.org**, **a website dedicated to sharing bird sounds from around the world**
## Dataset Description

This dataset collects the calls of `401` bird species found within China. For the detailed species list, [click here](./birds_list.json); the format is as follows:
```json
{
    "common name of the species": {
        "sp": "specific name of the species (epithet)",
        "ssp": "subspecies name (subspecific epithet)",
        "en": "English name of the species"
    }
}
```
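As an illustration, a minimal sketch for reading this species list with Python might look like the following; it assumes `birds_list.json` sits next to the audio files in the dataset root.

```python
import json

# Load the species list shipped with the dataset (path assumed)
with open("birds_list.json", encoding="utf-8") as f:
    birds = json.load(f)

# Each entry maps a common name to its specific, subspecies and English names
for common_name, names in list(birds.items())[:3]:
    print(common_name, "->", names["sp"], names.get("ssp"), names["en"])
```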
- **Shared by:** [sakei](https://huggingface.co/sakei)
- **License:** Apache-2.0
- **Sample format**: .wav
- **Size**: 51.4 GB (55,284,289,304 bytes)
- **Contains**: 6,507 files
- **Note: some initial letters have too many species, so those are packed as zip archives for easier storage**
## Sample Data
[demo1](./Tragopan.89866585.wav)
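As a quick illustration, a minimal sketch for inspecting the demo clip with Python's standard library could look like this; the file name is taken from the link above, and any other `.wav` sample from the dataset should work the same way.

```python
import wave

# Open the demo sample and print its basic audio properties
with wave.open("Tragopan.89866585.wav", "rb") as wav:
    print("channels:", wav.getnchannels())
    print("sample rate:", wav.getframerate(), "Hz")
    print("duration:", wav.getnframes() / wav.getframerate(), "seconds")
```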
| sakei/Bird_audio_in_China | [
"license:apache-2.0",
"region:us"
] | 2023-12-30T04:19:37+00:00 | {"license": "apache-2.0"} | 2024-01-24T17:09:04+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| # Dataset Card for Bird_audio_in_China
> This dataset collects the calls of '401' bird species found within China; the author used it for a simple 'TinyML' project: "Real-time bird species recognition based on the STM32F746"
>
> Bird call classification is a common environmental sound classification task and a good fit for exploring embedded AI applications; it also has real practical value for ecological research, bird conservation, and biodiversity monitoring. By compressing bird call classification algorithms and models onto small devices, these capabilities can be brought to more scenarios and applications, such as smart nest-box monitoring systems and drone patrol monitoring systems, which can be used to assess ecosystem health and monitor climate change, as well as to monitor and study bird distribution, migration routes, and habitat use.
>
> Disclaimer: the audio comes from https://URL, a website dedicated to sharing bird sounds from around the world
## Dataset Description
This dataset collects the calls of '401' bird species found within China. For the detailed species list click here; the format is as follows:
- Shared by: sakei
- License: Apache-2.0
- Sample format: .wav
- Size: 51.4 GB (55,284,289,304 bytes)
- Contains: 6,507 files
- Note: some initial letters have too many species, so those are packed as zip archives for easier storage
## Sample Data
demo1
| [
"# Dataset Card for Bird_audio_in_China\n\n> 本数据集收集了在中国境内的'401'种鸟种的叫声,作者将其用于一个简易的'TinyML'项目:《基于STM32F746实现实时鸟种识别》\n>\n> 鸟叫声音分类作为一种常见的环境音分类任务,也非常适合用于嵌入式AI应用的探索,并且在生态研究、鸟类保护、生物多样性监测都具有重要的现实意义。通过将鸟叫声音分类算法和模型压缩到小型设备中,可以将这些功能带到更多的场景和应用中,例如将鸟叫声音分类技术应用于智能鸟窝监控系统、无人机巡航监测系统等领域,用于评估生态系统的健康状态以及监测气候变化,也可以可以对鸟类的分布情况、迁徙路径、栖息地利用等进行监测和研究。\n>\n> 申明:声音源来自https://URL,这是一个致力于分享来自世界各地的鸟声的网站",
"## 数据集描述\n\n本数据集收集了在中国境内的'401'种鸟种的叫声,详细鸟种的列表点击这里,格式如下:\n\n\n\n- 分享者: sakei\n- 协议: Aapache-2.0\n- 数据样本格式: .wav\n- 大小: 51.4 GB(55,284,289,304字节)\n- 包含: 6,507个文件\n- 注意,部分首字母的鸟种太多,我打包为zip格式方便储存",
"## 数据示例样本\n\ndemo1"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# Dataset Card for Bird_audio_in_China\n\n> 本数据集收集了在中国境内的'401'种鸟种的叫声,作者将其用于一个简易的'TinyML'项目:《基于STM32F746实现实时鸟种识别》\n>\n> 鸟叫声音分类作为一种常见的环境音分类任务,也非常适合用于嵌入式AI应用的探索,并且在生态研究、鸟类保护、生物多样性监测都具有重要的现实意义。通过将鸟叫声音分类算法和模型压缩到小型设备中,可以将这些功能带到更多的场景和应用中,例如将鸟叫声音分类技术应用于智能鸟窝监控系统、无人机巡航监测系统等领域,用于评估生态系统的健康状态以及监测气候变化,也可以可以对鸟类的分布情况、迁徙路径、栖息地利用等进行监测和研究。\n>\n> 申明:声音源来自https://URL,这是一个致力于分享来自世界各地的鸟声的网站",
"## 数据集描述\n\n本数据集收集了在中国境内的'401'种鸟种的叫声,详细鸟种的列表点击这里,格式如下:\n\n\n\n- 分享者: sakei\n- 协议: Aapache-2.0\n- 数据样本格式: .wav\n- 大小: 51.4 GB(55,284,289,304字节)\n- 包含: 6,507个文件\n- 注意,部分首字母的鸟种太多,我打包为zip格式方便储存",
"## 数据示例样本\n\ndemo1"
] | [
14,
213,
109,
9
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n# Dataset Card for Bird_audio_in_China\n\n> 本数据集收集了在中国境内的'401'种鸟种的叫声,作者将其用于一个简易的'TinyML'项目:《基于STM32F746实现实时鸟种识别》\n>\n> 鸟叫声音分类作为一种常见的环境音分类任务,也非常适合用于嵌入式AI应用的探索,并且在生态研究、鸟类保护、生物多样性监测都具有重要的现实意义。通过将鸟叫声音分类算法和模型压缩到小型设备中,可以将这些功能带到更多的场景和应用中,例如将鸟叫声音分类技术应用于智能鸟窝监控系统、无人机巡航监测系统等领域,用于评估生态系统的健康状态以及监测气候变化,也可以可以对鸟类的分布情况、迁徙路径、栖息地利用等进行监测和研究。\n>\n> 申明:声音源来自https://URL,这是一个致力于分享来自世界各地的鸟声的网站## 数据集描述\n\n本数据集收集了在中国境内的'401'种鸟种的叫声,详细鸟种的列表点击这里,格式如下:\n\n\n\n- 分享者: sakei\n- 协议: Aapache-2.0\n- 数据样本格式: .wav\n- 大小: 51.4 GB(55,284,289,304字节)\n- 包含: 6,507个文件\n- 注意,部分首字母的鸟种太多,我打包为zip格式方便储存## 数据示例样本\n\ndemo1"
] |
d307a1e862efeb27aa1c834a89230298c76d461a |
# Dataset Card for Evaluation run of DopeorNope/SOLARC-MOE-10.7Bx4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DopeorNope/SOLARC-MOE-10.7Bx4](https://huggingface.co/DopeorNope/SOLARC-MOE-10.7Bx4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DopeorNope__SOLARC-MOE-10.7Bx4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-30T04:44:29.560090](https://huggingface.co/datasets/open-llm-leaderboard/details_DopeorNope__SOLARC-MOE-10.7Bx4/blob/main/results_2023-12-30T04-44-29.560090.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6670293129180128,
"acc_stderr": 0.03161017521655265,
"acc_norm": 0.6679232544824841,
"acc_norm_stderr": 0.032253513481689276,
"mc1": 0.5703794369645043,
"mc1_stderr": 0.01732923458040909,
"mc2": 0.719098916907282,
"mc2_stderr": 0.015045918795928207
},
"harness|arc:challenge|25": {
"acc": 0.6825938566552902,
"acc_stderr": 0.013602239088038167,
"acc_norm": 0.7098976109215017,
"acc_norm_stderr": 0.013261573677520767
},
"harness|hellaswag|10": {
"acc": 0.7134037044413464,
"acc_stderr": 0.004512471612415586,
"acc_norm": 0.8842859988050189,
"acc_norm_stderr": 0.003192279039468747
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6297872340425532,
"acc_stderr": 0.03156564682236786,
"acc_norm": 0.6297872340425532,
"acc_norm_stderr": 0.03156564682236786
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.02574806587167328,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.02574806587167328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329286,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329286
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603347,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603347
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.726890756302521,
"acc_stderr": 0.028942004040998167,
"acc_norm": 0.726890756302521,
"acc_norm_stderr": 0.028942004040998167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.0230943295825957,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.0230943295825957
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657567,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39106145251396646,
"acc_stderr": 0.016320763763808383,
"acc_norm": 0.39106145251396646,
"acc_norm_stderr": 0.016320763763808383
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4876140808344198,
"acc_stderr": 0.012766317315473556,
"acc_norm": 0.4876140808344198,
"acc_norm_stderr": 0.012766317315473556
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.02655651947004151,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.02655651947004151
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.018850084696468712,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.018850084696468712
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857834,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857834
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338733,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338733
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5703794369645043,
"mc1_stderr": 0.01732923458040909,
"mc2": 0.719098916907282,
"mc2_stderr": 0.015045918795928207
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.01041084977522279
},
"harness|gsm8k|5": {
"acc": 0.643669446550417,
"acc_stderr": 0.013191685031357463
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_DopeorNope__SOLARC-MOE-10.7Bx4 | [
"region:us"
] | 2023-12-30T04:46:51+00:00 | {"pretty_name": "Evaluation run of DopeorNope/SOLARC-MOE-10.7Bx4", "dataset_summary": "Dataset automatically created during the evaluation run of model [DopeorNope/SOLARC-MOE-10.7Bx4](https://huggingface.co/DopeorNope/SOLARC-MOE-10.7Bx4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DopeorNope__SOLARC-MOE-10.7Bx4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T04:44:29.560090](https://huggingface.co/datasets/open-llm-leaderboard/details_DopeorNope__SOLARC-MOE-10.7Bx4/blob/main/results_2023-12-30T04-44-29.560090.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6670293129180128,\n \"acc_stderr\": 0.03161017521655265,\n \"acc_norm\": 0.6679232544824841,\n \"acc_norm_stderr\": 0.032253513481689276,\n \"mc1\": 0.5703794369645043,\n \"mc1_stderr\": 0.01732923458040909,\n \"mc2\": 0.719098916907282,\n \"mc2_stderr\": 0.015045918795928207\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6825938566552902,\n \"acc_stderr\": 0.013602239088038167,\n \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520767\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7134037044413464,\n \"acc_stderr\": 0.004512471612415586,\n \"acc_norm\": 0.8842859988050189,\n \"acc_norm_stderr\": 0.003192279039468747\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6297872340425532,\n \"acc_stderr\": 0.03156564682236786,\n \"acc_norm\": 0.6297872340425532,\n \"acc_norm_stderr\": 0.03156564682236786\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.02574806587167328,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.02574806587167328\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n \"acc_stderr\": 0.021732540689329286,\n \"acc_norm\": 0.8225806451612904,\n \"acc_norm_stderr\": 0.021732540689329286\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603347,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603347\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.726890756302521,\n \"acc_stderr\": 0.028942004040998167,\n \"acc_norm\": 0.726890756302521,\n \"acc_norm_stderr\": 0.028942004040998167\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8523206751054853,\n \"acc_stderr\": 0.0230943295825957,\n \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.0230943295825957\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n \"acc_stderr\": 0.014143970276657567,\n \"acc_norm\": 
0.8058748403575989,\n \"acc_norm_stderr\": 0.014143970276657567\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39106145251396646,\n \"acc_stderr\": 0.016320763763808383,\n \"acc_norm\": 0.39106145251396646,\n \"acc_norm_stderr\": 0.016320763763808383\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4876140808344198,\n \"acc_stderr\": 0.012766317315473556,\n \"acc_norm\": 0.4876140808344198,\n \"acc_norm_stderr\": 0.012766317315473556\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.02655651947004151,\n \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.02655651947004151\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.018850084696468712,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.018850084696468712\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857834,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857834\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338733,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338733\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5703794369645043,\n \"mc1_stderr\": 0.01732923458040909,\n \"mc2\": 0.719098916907282,\n \"mc2_stderr\": 0.015045918795928207\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.01041084977522279\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.643669446550417,\n \"acc_stderr\": 0.013191685031357463\n }\n}\n```", "repo_url": "https://huggingface.co/DopeorNope/SOLARC-MOE-10.7Bx4", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|arc:challenge|25_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|gsm8k|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hellaswag|10_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T04-44-29.560090.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T04-44-29.560090.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T04-44-29.560090.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T04-44-29.560090.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T04-44-29.560090.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T04-44-29.560090.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["**/details_harness|winogrande|5_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T04-44-29.560090.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T04_44_29.560090", "path": ["results_2023-12-30T04-44-29.560090.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T04-44-29.560090.parquet"]}]}]} | 2023-12-30T04:47:10+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of DopeorNope/SOLARC-MOE-10.7Bx4
Dataset automatically created during the evaluation run of model DopeorNope/SOLARC-MOE-10.7Bx4 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
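A minimal sketch of that load call, assuming the dataset repository follows the leaderboard's usual `details_<org>__<model>` naming (the exact repository name and configuration below are assumptions, not quoted from this card):

```python
from datasets import load_dataset

# Assumed repository name, derived from the leaderboard's naming convention;
# "harness_winogrande_5" is one of the per-task evaluation configurations.
data = load_dataset(
    "open-llm-leaderboard/details_DopeorNope__SOLARC-MOE-10.7Bx4",
    "harness_winogrande_5",
    split="train",
)
print(data)
```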
## Latest results
These are the latest results from run 2023-12-30T04:44:29.560090 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of DopeorNope/SOLARC-MOE-10.7Bx4\n\n\n\nDataset automatically created during the evaluation run of model DopeorNope/SOLARC-MOE-10.7Bx4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T04:44:29.560090(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of DopeorNope/SOLARC-MOE-10.7Bx4\n\n\n\nDataset automatically created during the evaluation run of model DopeorNope/SOLARC-MOE-10.7Bx4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T04:44:29.560090(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
197,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of DopeorNope/SOLARC-MOE-10.7Bx4\n\n\n\nDataset automatically created during the evaluation run of model DopeorNope/SOLARC-MOE-10.7Bx4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T04:44:29.560090(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
139b52b54cd463659b819e2847d26042733222de |
# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.2-yi-34b-200k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cognitivecomputations/dolphin-2.2-yi-34b-200k](https://huggingface.co/cognitivecomputations/dolphin-2.2-yi-34b-200k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2-yi-34b-200k",
"harness_winogrande_5",
split="train")
```
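
The aggregated "results" configuration mentioned above can be loaded the same way. A minimal sketch, assuming the "results" config and its "latest" split follow the same pattern as the per-task configurations:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; the "latest" split always
# points to the newest evaluation, per the description above.
results = load_dataset(
    "open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2-yi-34b-200k",
    "results",
    split="latest",
)
print(results[0])  # the first (and typically only) row holds the aggregated metrics
```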
## Latest results
These are the [latest results from run 2023-12-30T04:55:41.011890](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2-yi-34b-200k/blob/main/results_2023-12-30T04-55-41.011890.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5429897039109348,
"acc_stderr": 0.034024777660715086,
"acc_norm": 0.5533854375327871,
"acc_norm_stderr": 0.034866231322601235,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627904,
"mc2": 0.45933703025376155,
"mc2_stderr": 0.01568029542861706
},
"harness|arc:challenge|25": {
"acc": 0.38822525597269625,
"acc_stderr": 0.014241614207414037,
"acc_norm": 0.4206484641638225,
"acc_norm_stderr": 0.014426211252508403
},
"harness|hellaswag|10": {
"acc": 0.5128460466042621,
"acc_stderr": 0.004988134303021787,
"acc_norm": 0.6813383788090022,
"acc_norm_stderr": 0.004650052150094422
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.02983280811479601,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.02983280811479601
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.03807301726504514,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.03807301726504514
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.03268335899936336,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.03268335899936336
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332783,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.0266620105785671,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.0266620105785671
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.034454876862647144,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.034454876862647144
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7616580310880829,
"acc_stderr": 0.030748905363909895,
"acc_norm": 0.7616580310880829,
"acc_norm_stderr": 0.030748905363909895
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47435897435897434,
"acc_stderr": 0.025317649726448656,
"acc_norm": 0.47435897435897434,
"acc_norm_stderr": 0.025317649726448656
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228412,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228412
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7119266055045872,
"acc_stderr": 0.01941644589263603,
"acc_norm": 0.7119266055045872,
"acc_norm_stderr": 0.01941644589263603
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.0309645179269234,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.0309645179269234
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6098654708520179,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.6098654708520179,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.047928981709070624,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.047928981709070624
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.03731133519673893,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.03731133519673893
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.5825242718446602,
"acc_stderr": 0.048828405482122375,
"acc_norm": 0.5825242718446602,
"acc_norm_stderr": 0.048828405482122375
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7521367521367521,
"acc_stderr": 0.028286324075564393,
"acc_norm": 0.7521367521367521,
"acc_norm_stderr": 0.028286324075564393
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7496807151979565,
"acc_stderr": 0.015491088951494581,
"acc_norm": 0.7496807151979565,
"acc_norm_stderr": 0.015491088951494581
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.026897049996382875,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.026897049996382875
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37094972067039106,
"acc_stderr": 0.016155910721341774,
"acc_norm": 0.37094972067039106,
"acc_norm_stderr": 0.016155910721341774
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.027634176689602667,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.027634176689602667
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325963,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325963
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5987654320987654,
"acc_stderr": 0.027272582849839792,
"acc_norm": 0.5987654320987654,
"acc_norm_stderr": 0.027272582849839792
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.02883892147125146,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.02883892147125146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4211212516297262,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.4211212516297262,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.03029061918048569,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.03029061918048569
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.020219083895133924,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.020219083895133924
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670238,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670238
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.03071356045510849,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.03071356045510849
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919797,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919797
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.034678266857038245,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.034678266857038245
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627904,
"mc2": 0.45933703025376155,
"mc2_stderr": 0.01568029542861706
},
"harness|winogrande|5": {
"acc": 0.6424625098658248,
"acc_stderr": 0.01347000744392069
},
"harness|gsm8k|5": {
"acc": 0.0310841546626232,
"acc_stderr": 0.004780296718393351
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2-yi-34b-200k | [
"region:us"
] | 2023-12-30T04:54:32+00:00 | {"pretty_name": "Evaluation run of cognitivecomputations/dolphin-2.2-yi-34b-200k", "dataset_summary": "Dataset automatically created during the evaluation run of model [cognitivecomputations/dolphin-2.2-yi-34b-200k](https://huggingface.co/cognitivecomputations/dolphin-2.2-yi-34b-200k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2-yi-34b-200k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T04:55:41.011890](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2-yi-34b-200k/blob/main/results_2023-12-30T04-55-41.011890.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5429897039109348,\n \"acc_stderr\": 0.034024777660715086,\n \"acc_norm\": 0.5533854375327871,\n \"acc_norm_stderr\": 0.034866231322601235,\n \"mc1\": 0.27906976744186046,\n \"mc1_stderr\": 0.015702107090627904,\n \"mc2\": 0.45933703025376155,\n \"mc2_stderr\": 0.01568029542861706\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.38822525597269625,\n \"acc_stderr\": 0.014241614207414037,\n \"acc_norm\": 0.4206484641638225,\n \"acc_norm_stderr\": 0.014426211252508403\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5128460466042621,\n \"acc_stderr\": 0.004988134303021787,\n \"acc_norm\": 0.6813383788090022,\n \"acc_norm_stderr\": 0.004650052150094422\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.02983280811479601,\n \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.02983280811479601\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n \"acc_norm_stderr\": 0.040629907841466674\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.03807301726504514,\n \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.03807301726504514\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.03268335899936336,\n \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.03268335899936336\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n \"acc_stderr\": 0.0266620105785671,\n \"acc_norm\": 0.6741935483870968,\n \"acc_norm_stderr\": 0.0266620105785671\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.034454876862647144,\n \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.034454876862647144\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.030748905363909895,\n \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 
0.030748905363909895\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.47435897435897434,\n \"acc_stderr\": 0.025317649726448656,\n \"acc_norm\": 0.47435897435897434,\n \"acc_norm_stderr\": 0.025317649726448656\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228412,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228412\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.03218358107742613,\n \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.03218358107742613\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7119266055045872,\n \"acc_stderr\": 0.01941644589263603,\n \"acc_norm\": 0.7119266055045872,\n \"acc_norm_stderr\": 0.01941644589263603\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.0309645179269234,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.0309645179269234\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.6098654708520179,\n \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.047928981709070624,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.047928981709070624\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.03731133519673893,\n \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.03731133519673893\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.048828405482122375,\n \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.048828405482122375\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n \"acc_stderr\": 0.028286324075564393,\n \"acc_norm\": 0.7521367521367521,\n \"acc_norm_stderr\": 0.028286324075564393\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7496807151979565,\n \"acc_stderr\": 0.015491088951494581,\n \"acc_norm\": 0.7496807151979565,\n \"acc_norm_stderr\": 0.015491088951494581\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.026897049996382875,\n \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.026897049996382875\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37094972067039106,\n \"acc_stderr\": 0.016155910721341774,\n \"acc_norm\": 0.37094972067039106,\n \"acc_norm_stderr\": 0.016155910721341774\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.027634176689602667,\n \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.027634176689602667\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n \"acc_stderr\": 0.027882383791325963,\n \"acc_norm\": 0.594855305466238,\n \"acc_norm_stderr\": 0.027882383791325963\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5987654320987654,\n \"acc_stderr\": 0.027272582849839792,\n \"acc_norm\": 0.5987654320987654,\n \"acc_norm_stderr\": 0.027272582849839792\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3723404255319149,\n \"acc_stderr\": 0.02883892147125146,\n \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.02883892147125146\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.4211212516297262,\n \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.020219083895133924,\n \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.020219083895133924\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.04673752333670238,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.04673752333670238\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.03071356045510849,\n \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.03071356045510849\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.7164179104477612,\n \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.43373493975903615,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.034678266857038245,\n \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.034678266857038245\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n \"mc1_stderr\": 0.015702107090627904,\n \"mc2\": 0.45933703025376155,\n \"mc2_stderr\": 0.01568029542861706\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6424625098658248,\n \"acc_stderr\": 0.01347000744392069\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0310841546626232,\n \"acc_stderr\": 0.004780296718393351\n 
}\n}\n```", "repo_url": "https://huggingface.co/cognitivecomputations/dolphin-2.2-yi-34b-200k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|arc:challenge|25_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|arc:challenge|25_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|gsm8k|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|gsm8k|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hellaswag|10_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hellaswag|10_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T04-52-22.253489.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T04-52-22.253489.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T04-55-41.011890.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T04-55-41.011890.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T04-55-41.011890.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T04-55-41.011890.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T04-52-22.253489.parquet"]}, 
{"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["**/details_harness|winogrande|5_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": ["**/details_harness|winogrande|5_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T04-55-41.011890.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T04_52_22.253489", "path": ["results_2023-12-30T04-52-22.253489.parquet"]}, {"split": "2023_12_30T04_55_41.011890", "path": 
["results_2023-12-30T04-55-41.011890.parquet"]}, {"split": "latest", "path": ["results_2023-12-30T04-55-41.011890.parquet"]}]}]} | 2023-12-30T04:58:14+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.2-yi-34b-200k
Dataset automatically created during the evaluation run of model cognitivecomputations/dolphin-2.2-yi-34b-200k on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
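For instance (a minimal sketch; the repository id below is assumed from the leaderboard's usual `details_<org>__<model>` naming convention, and the task configuration name follows the pattern used throughout this card):

```python
from datasets import load_dataset

# Repository id assumed from the Open LLM Leaderboard naming convention
# for per-run detail datasets; adjust if the actual id differs.
data = load_dataset(
    "open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2-yi-34b-200k",
    "harness_winogrande_5",   # one of the 63 task configurations
    split="train",            # "train" always points to the latest results
)
```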
## Latest results
These are the latest results from run 2023-12-30T04:55:41.011890 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.2-yi-34b-200k\n\n\n\nDataset automatically created during the evaluation run of model cognitivecomputations/dolphin-2.2-yi-34b-200k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T04:55:41.011890(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.2-yi-34b-200k\n\n\n\nDataset automatically created during the evaluation run of model cognitivecomputations/dolphin-2.2-yi-34b-200k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T04:55:41.011890(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.2-yi-34b-200k\n\n\n\nDataset automatically created during the evaluation run of model cognitivecomputations/dolphin-2.2-yi-34b-200k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T04:55:41.011890(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
12889c51534c0127ccf35d357c3d7860f43868c6 |
# Dataset Card for Evaluation run of beberik/rawr
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [beberik/rawr](https://huggingface.co/beberik/rawr) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beberik__rawr",
"harness_winogrande_5",
split="train")
```
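
The same call works for any of the other configurations listed further down in this card. A minimal sketch (the `harness_gsm8k_5` config name and the `latest` split are taken from this card's config list; any other task config can be substituted):

```python
from datasets import load_dataset

# Per-sample details for a single task; config names follow the
# "harness_<task>_<n_fewshot>" pattern, and "latest" always points to the
# most recent evaluation run (here 2023-12-30T05:23:31.388008).
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_beberik__rawr",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details)
```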
## Latest results
These are the [latest results from run 2023-12-30T05:23:31.388008](https://huggingface.co/datasets/open-llm-leaderboard/details_beberik__rawr/blob/main/results_2023-12-30T05-23-31.388008.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6479677171016213,
"acc_stderr": 0.031930945931820144,
"acc_norm": 0.6502582495559832,
"acc_norm_stderr": 0.03256812569958386,
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502025,
"mc2": 0.520672757948257,
"mc2_stderr": 0.015405430705291837
},
"harness|arc:challenge|25": {
"acc": 0.6168941979522184,
"acc_stderr": 0.014206472661672876,
"acc_norm": 0.6399317406143344,
"acc_norm_stderr": 0.014027516814585188
},
"harness|hellaswag|10": {
"acc": 0.656144194383589,
"acc_stderr": 0.004740229212473455,
"acc_norm": 0.848635729934276,
"acc_norm_stderr": 0.0035767110656195903
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5467980295566502,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.5467980295566502,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887048,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887048
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.02704462171947409,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.02704462171947409
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290902,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290902
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940876,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940876
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884866,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884866
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066292,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066292
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546835,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26927374301675977,
"acc_stderr": 0.01483561658288261,
"acc_norm": 0.26927374301675977,
"acc_norm_stderr": 0.01483561658288261
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.024404394928087866,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.024404394928087866
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.01275015180292244,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.01275015180292244
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806308,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806308
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174923,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174923
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502025,
"mc2": 0.520672757948257,
"mc2_stderr": 0.015405430705291837
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597212
},
"harness|gsm8k|5": {
"acc": 0.5807429871114481,
"acc_stderr": 0.013591720959042115
}
}
```
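
Rather than copying numbers out of the JSON above, the aggregated metrics can also be loaded through the "results" configuration described earlier (a minimal sketch; the exact column layout of the results parquet is an assumption and worth inspecting via `features`):

```python
from datasets import load_dataset

# Aggregated metrics of the latest run, served by the "results" config.
results = load_dataset(
    "open-llm-leaderboard/details_beberik__rawr",
    "results",
    split="latest",
)
print(results)           # dataset object holding the aggregated results table
print(results.features)  # column layout (assumed to mirror the JSON above)
```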
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_beberik__rawr | [
"region:us"
] | 2023-12-30T05:25:47+00:00 | {"pretty_name": "Evaluation run of beberik/rawr", "dataset_summary": "Dataset automatically created during the evaluation run of model [beberik/rawr](https://huggingface.co/beberik/rawr) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beberik__rawr\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T05:23:31.388008](https://huggingface.co/datasets/open-llm-leaderboard/details_beberik__rawr/blob/main/results_2023-12-30T05-23-31.388008.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6479677171016213,\n \"acc_stderr\": 0.031930945931820144,\n \"acc_norm\": 0.6502582495559832,\n \"acc_norm_stderr\": 0.03256812569958386,\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502025,\n \"mc2\": 0.520672757948257,\n \"mc2_stderr\": 0.015405430705291837\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6168941979522184,\n \"acc_stderr\": 0.014206472661672876,\n \"acc_norm\": 0.6399317406143344,\n \"acc_norm_stderr\": 0.014027516814585188\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.656144194383589,\n \"acc_stderr\": 0.004740229212473455,\n \"acc_norm\": 0.848635729934276,\n \"acc_norm_stderr\": 0.0035767110656195903\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5467980295566502,\n \"acc_stderr\": 0.03502544650845872,\n \"acc_norm\": 0.5467980295566502,\n \"acc_norm_stderr\": 0.03502544650845872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \"acc_norm\": 
0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887048,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887048\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.02704462171947409,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.02704462171947409\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.038498560987940876,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940876\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.013468201614066292,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066292\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546835,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546835\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26927374301675977,\n \"acc_stderr\": 0.01483561658288261,\n \"acc_norm\": 0.26927374301675977,\n \"acc_norm_stderr\": 0.01483561658288261\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.024404394928087866,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.024404394928087866\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n \"acc_stderr\": 0.01275015180292244,\n \"acc_norm\": 0.47196870925684486,\n \"acc_norm_stderr\": 0.01275015180292244\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806308,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806308\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174923,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174923\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502025,\n \"mc2\": 0.520672757948257,\n \"mc2_stderr\": 0.015405430705291837\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597212\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5807429871114481,\n \"acc_stderr\": 0.013591720959042115\n }\n}\n```", "repo_url": "https://huggingface.co/beberik/rawr", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|arc:challenge|25_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|gsm8k|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hellaswag|10_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-23-31.388008.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-23-31.388008.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-23-31.388008.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T05-23-31.388008.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-23-31.388008.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["**/details_harness|winogrande|5_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T05-23-31.388008.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T05_23_31.388008", "path": ["results_2023-12-30T05-23-31.388008.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T05-23-31.388008.parquet"]}]}]} | 2023-12-30T05:26:08+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of beberik/rawr
Dataset automatically created during the evaluation run of model beberik/rawr on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
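For example (a minimal sketch, assuming the details repository follows the leaderboard's standard naming convention, i.e. `open-llm-leaderboard/details_beberik__rawr`, and the per-task configuration names shown in this card's metadata):

```python
from datasets import load_dataset

# Load the per-sample details for one task of the latest evaluation run
data = load_dataset("open-llm-leaderboard/details_beberik__rawr",
                    "harness_winogrande_5",
                    split="train")
```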
## Latest results
These are the latest results from run 2023-12-30T05:23:31.388008 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of beberik/rawr\n\n\n\nDataset automatically created during the evaluation run of model beberik/rawr on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T05:23:31.388008(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of beberik/rawr\n\n\n\nDataset automatically created during the evaluation run of model beberik/rawr on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T05:23:31.388008(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
171,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of beberik/rawr\n\n\n\nDataset automatically created during the evaluation run of model beberik/rawr on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T05:23:31.388008(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
3253e3c2899cf0231efe67573f9405af780d306a |
# Dataset Card for Evaluation run of 0x7194633/fialka-13B-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [0x7194633/fialka-13B-v3](https://huggingface.co/0x7194633/fialka-13B-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_0x7194633__fialka-13B-v3",
"harness_winogrande_5",
split="train")
```
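You can also pull just the aggregated scores. This is a minimal sketch that assumes the `results` configuration declared in this card's metadata, which exposes a `latest` split like the per-task configurations:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run (one row per run)
results = load_dataset("open-llm-leaderboard/details_0x7194633__fialka-13B-v3",
                       "results",
                       split="latest")
print(results[0])  # dict of the aggregated accuracy figures shown below
```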
## Latest results
These are the [latest results from run 2023-12-30T05:28:30.247566](https://huggingface.co/datasets/open-llm-leaderboard/details_0x7194633__fialka-13B-v3/blob/main/results_2023-12-30T05-28-30.247566.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.267282756151897,
"acc_stderr": 0.031086803905318892,
"acc_norm": 0.2681655321925398,
"acc_norm_stderr": 0.031861580271542414,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.01539211880501503,
"mc2": 0.4058016031292479,
"mc2_stderr": 0.014619680021653623
},
"harness|arc:challenge|25": {
"acc": 0.28498293515358364,
"acc_stderr": 0.013191348179838793,
"acc_norm": 0.3097269624573379,
"acc_norm_stderr": 0.013512058415238361
},
"harness|hellaswag|10": {
"acc": 0.38836885082652856,
"acc_stderr": 0.004863831364848084,
"acc_norm": 0.48834893447520417,
"acc_norm_stderr": 0.004988426528513012
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073464,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073464
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24528301886792453,
"acc_stderr": 0.02648035717989568,
"acc_norm": 0.24528301886792453,
"acc_norm_stderr": 0.02648035717989568
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.032147373020294696,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.032147373020294696
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322716,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322716
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.028185441301234102,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.028185441301234102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.0409698513984367,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.0409698513984367
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727772,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727772
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21957671957671956,
"acc_stderr": 0.021320018599770348,
"acc_norm": 0.21957671957671956,
"acc_norm_stderr": 0.021320018599770348
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333337,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333337
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.024892469172462836,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.024892469172462836
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2315270935960591,
"acc_stderr": 0.02967833314144444,
"acc_norm": 0.2315270935960591,
"acc_norm_stderr": 0.02967833314144444
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3484848484848485,
"acc_stderr": 0.033948539651564025,
"acc_norm": 0.3484848484848485,
"acc_norm_stderr": 0.033948539651564025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.29015544041450775,
"acc_stderr": 0.03275264467791515,
"acc_norm": 0.29015544041450775,
"acc_norm_stderr": 0.03275264467791515
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3282051282051282,
"acc_stderr": 0.02380763319865726,
"acc_norm": 0.3282051282051282,
"acc_norm_stderr": 0.02380763319865726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145665,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145665
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882385,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882385
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27889908256880735,
"acc_stderr": 0.01922746887646353,
"acc_norm": 0.27889908256880735,
"acc_norm_stderr": 0.01922746887646353
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501964,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501964
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.02917868230484255,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.02917868230484255
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2914798206278027,
"acc_stderr": 0.030500283176545913,
"acc_norm": 0.2914798206278027,
"acc_norm_stderr": 0.030500283176545913
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052191,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052191
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2264957264957265,
"acc_stderr": 0.027421007295392916,
"acc_norm": 0.2264957264957265,
"acc_norm_stderr": 0.027421007295392916
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.01576998484069052,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.01576998484069052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23575418994413408,
"acc_stderr": 0.014196375686290804,
"acc_norm": 0.23575418994413408,
"acc_norm_stderr": 0.014196375686290804
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.022122439772480778,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.022122439772480778
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.022779719088733396,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.022779719088733396
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.027807990141320186,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.027807990141320186
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25097783572359844,
"acc_stderr": 0.011073730299187233,
"acc_norm": 0.25097783572359844,
"acc_norm_stderr": 0.011073730299187233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329376,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329376
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.01755581809132226,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.01755581809132226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721377,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721377
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.35918367346938773,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.35918367346938773,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.02899690969332891,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.02899690969332891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2573099415204678,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.2573099415204678,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.01539211880501503,
"mc2": 0.4058016031292479,
"mc2_stderr": 0.014619680021653623
},
"harness|winogrande|5": {
"acc": 0.5943172849250198,
"acc_stderr": 0.013800206336014201
},
"harness|gsm8k|5": {
"acc": 0.01288855193328279,
"acc_stderr": 0.003106901266499657
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_0x7194633__fialka-13B-v3 | [
"region:us"
] | 2023-12-30T05:30:20+00:00 | {"pretty_name": "Evaluation run of 0x7194633/fialka-13B-v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [0x7194633/fialka-13B-v3](https://huggingface.co/0x7194633/fialka-13B-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_0x7194633__fialka-13B-v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T05:28:30.247566](https://huggingface.co/datasets/open-llm-leaderboard/details_0x7194633__fialka-13B-v3/blob/main/results_2023-12-30T05-28-30.247566.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.267282756151897,\n \"acc_stderr\": 0.031086803905318892,\n \"acc_norm\": 0.2681655321925398,\n \"acc_norm_stderr\": 0.031861580271542414,\n \"mc1\": 0.26193390452876375,\n \"mc1_stderr\": 0.01539211880501503,\n \"mc2\": 0.4058016031292479,\n \"mc2_stderr\": 0.014619680021653623\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.28498293515358364,\n \"acc_stderr\": 0.013191348179838793,\n \"acc_norm\": 0.3097269624573379,\n \"acc_norm_stderr\": 0.013512058415238361\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.38836885082652856,\n \"acc_stderr\": 0.004863831364848084,\n \"acc_norm\": 0.48834893447520417,\n \"acc_norm_stderr\": 0.004988426528513012\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.03633384414073464,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.03633384414073464\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.24528301886792453,\n \"acc_stderr\": 0.02648035717989568,\n \"acc_norm\": 0.24528301886792453,\n \"acc_norm_stderr\": 0.02648035717989568\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n 
\"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.032147373020294696,\n \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.032147373020294696\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322716,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322716\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.028185441301234102,\n \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.028185441301234102\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.0409698513984367,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.0409698513984367\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727772,\n \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727772\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.21957671957671956,\n \"acc_stderr\": 0.021320018599770348,\n \"acc_norm\": 0.21957671957671956,\n \"acc_norm_stderr\": 0.021320018599770348\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03333333333333337,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03333333333333337\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n \"acc_stderr\": 0.024892469172462836,\n \"acc_norm\": 0.25806451612903225,\n \"acc_norm_stderr\": 0.024892469172462836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2315270935960591,\n \"acc_stderr\": 0.02967833314144444,\n \"acc_norm\": 0.2315270935960591,\n \"acc_norm_stderr\": 0.02967833314144444\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3484848484848485,\n \"acc_stderr\": 0.033948539651564025,\n \"acc_norm\": 0.3484848484848485,\n \"acc_norm_stderr\": 0.033948539651564025\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.29015544041450775,\n \"acc_stderr\": 0.03275264467791515,\n \"acc_norm\": 0.29015544041450775,\n \"acc_norm_stderr\": 0.03275264467791515\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.3282051282051282,\n \"acc_stderr\": 0.02380763319865726,\n \"acc_norm\": 0.3282051282051282,\n \"acc_norm_stderr\": 0.02380763319865726\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145665,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145665\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882385,\n \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882385\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.27889908256880735,\n \"acc_stderr\": 0.01922746887646353,\n \"acc_norm\": 0.27889908256880735,\n \"acc_norm_stderr\": 0.01922746887646353\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501964,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501964\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.27848101265822783,\n \"acc_stderr\": 0.02917868230484255,\n \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.02917868230484255\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2914798206278027,\n \"acc_stderr\": 0.030500283176545913,\n \"acc_norm\": 0.2914798206278027,\n \"acc_norm_stderr\": 0.030500283176545913\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052191,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052191\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2264957264957265,\n \"acc_stderr\": 0.027421007295392916,\n \"acc_norm\": 0.2264957264957265,\n \"acc_norm_stderr\": 0.027421007295392916\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26436781609195403,\n \"acc_stderr\": 
0.01576998484069052,\n \"acc_norm\": 0.26436781609195403,\n \"acc_norm_stderr\": 0.01576998484069052\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23575418994413408,\n \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.23575418994413408,\n \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.022122439772480778,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.022122439772480778\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.022779719088733396,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.022779719088733396\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.027807990141320186,\n \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.027807990141320186\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25097783572359844,\n \"acc_stderr\": 0.011073730299187233,\n \"acc_norm\": 0.25097783572359844,\n \"acc_norm_stderr\": 0.011073730299187233\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329376,\n \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329376\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.01755581809132226,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.01755581809132226\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.04069306319721377,\n \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.04069306319721377\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.35918367346938773,\n \"acc_stderr\": 0.030713560455108493,\n \"acc_norm\": 0.35918367346938773,\n \"acc_norm_stderr\": 0.030713560455108493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21393034825870647,\n \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.21393034825870647,\n \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2573099415204678,\n \"acc_stderr\": 0.03352799844161865,\n \"acc_norm\": 0.2573099415204678,\n \"acc_norm_stderr\": 0.03352799844161865\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n \"mc1_stderr\": 0.01539211880501503,\n \"mc2\": 0.4058016031292479,\n \"mc2_stderr\": 0.014619680021653623\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5943172849250198,\n \"acc_stderr\": 0.013800206336014201\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01288855193328279,\n \"acc_stderr\": 0.003106901266499657\n }\n}\n```", 
"repo_url": "https://huggingface.co/0x7194633/fialka-13B-v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|arc:challenge|25_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|gsm8k|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hellaswag|10_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-28-30.247566.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-28-30.247566.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-28-30.247566.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T05-28-30.247566.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-28-30.247566.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-28-30.247566.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["**/details_harness|winogrande|5_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T05-28-30.247566.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T05_28_30.247566", "path": ["results_2023-12-30T05-28-30.247566.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T05-28-30.247566.parquet"]}]}]} | 2023-12-30T05:30:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of 0x7194633/fialka-13B-v3
Dataset automatically created during the evaluation run of model 0x7194633/fialka-13B-v3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-30T05:28:30.247566 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of 0x7194633/fialka-13B-v3\n\n\n\nDataset automatically created during the evaluation run of model 0x7194633/fialka-13B-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T05:28:30.247566(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of 0x7194633/fialka-13B-v3\n\n\n\nDataset automatically created during the evaluation run of model 0x7194633/fialka-13B-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T05:28:30.247566(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of 0x7194633/fialka-13B-v3\n\n\n\nDataset automatically created during the evaluation run of model 0x7194633/fialka-13B-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T05:28:30.247566(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
df579c1c23792f6d5c3561321e055ae4a9dbec97 |
# Dataset Card for Evaluation run of martyn/mixtral-megamerge-dare-8x7b-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [martyn/mixtral-megamerge-dare-8x7b-v2](https://huggingface.co/martyn/mixtral-megamerge-dare-8x7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_martyn__mixtral-megamerge-dare-8x7b-v2",
"harness_winogrande_5",
split="train")
```
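
Once loaded, the split is a regular `datasets.Dataset`, so you can inspect individual records directly. The snippet below is a minimal sketch; the exact column names depend on the task being inspected and are not listed here.

```python
# Minimal sketch: inspect the loaded split (column names vary by task)
print(data.column_names)  # fields available for this task
print(len(data))          # number of evaluated examples
print(data[0])            # first record of the latest run
```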
## Latest results
These are the [latest results from run 2024-01-14T07:03:35.967501](https://huggingface.co/datasets/open-llm-leaderboard/details_martyn__mixtral-megamerge-dare-8x7b-v2/blob/main/results_2024-01-14T07-03-35.967501.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6893459569280364,
"acc_stderr": 0.030858049040324388,
"acc_norm": 0.6938293567967714,
"acc_norm_stderr": 0.03145368794832943,
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559635,
"mc2": 0.5381182686685855,
"mc2_stderr": 0.0153563125426782
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759091,
"acc_norm": 0.6646757679180887,
"acc_norm_stderr": 0.013796182947785562
},
"harness|hellaswag|10": {
"acc": 0.6766580362477594,
"acc_stderr": 0.004667960519938637,
"acc_norm": 0.8610834495120494,
"acc_norm_stderr": 0.003451525868724678
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7660377358490567,
"acc_stderr": 0.02605529690115292,
"acc_norm": 0.7660377358490567,
"acc_norm_stderr": 0.02605529690115292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.03476599607516477,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.03476599607516477
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.030783736757745657,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.030783736757745657
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04644602091222317,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04644602091222317
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6620689655172414,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.6620689655172414,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.02572209706438853,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.02572209706438853
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268556,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268556
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5566502463054187,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.5566502463054187,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02655220782821529,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02655220782821529
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678185,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678185
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857403,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857403
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.029213549414372167,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.029213549414372167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.040428099613956346,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.040428099613956346
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660836,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660836
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568606,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568606
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.02164419572795517,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.02164419572795517
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7623318385650224,
"acc_stderr": 0.028568079464714284,
"acc_norm": 0.7623318385650224,
"acc_norm_stderr": 0.028568079464714284
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.01911989279892498,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.01911989279892498
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8825031928480205,
"acc_stderr": 0.011515102251977185,
"acc_norm": 0.8825031928480205,
"acc_norm_stderr": 0.011515102251977185
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7774566473988439,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.7774566473988439,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43687150837988825,
"acc_stderr": 0.016588680864530622,
"acc_norm": 0.43687150837988825,
"acc_norm_stderr": 0.016588680864530622
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824785,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824785
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.024723861504771696,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.024723861504771696
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8179012345679012,
"acc_stderr": 0.02147349183480834,
"acc_norm": 0.8179012345679012,
"acc_norm_stderr": 0.02147349183480834
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5260756192959583,
"acc_stderr": 0.012752858346533143,
"acc_norm": 0.5260756192959583,
"acc_norm_stderr": 0.012752858346533143
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.027678468642144714,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.027678468642144714
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.017362473762146627,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.017362473762146627
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.027212835884073142,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.027212835884073142
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776348,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776348
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.02464806896136615,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.02464806896136615
},
"harness|truthfulqa:mc|0": {
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559635,
"mc2": 0.5381182686685855,
"mc2_stderr": 0.0153563125426782
},
"harness|winogrande|5": {
"acc": 0.797947908445146,
"acc_stderr": 0.011285013754047443
},
"harness|gsm8k|5": {
"acc": 0.5390447308567097,
"acc_stderr": 0.01373042844911634
}
}
```
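
The aggregated metrics shown above are also stored in the "results" configuration, so they can be loaded programmatically instead of being copied from the JSON. A minimal sketch, following the same `load_dataset` pattern as above (the "latest" split points to the most recent run):

```python
from datasets import load_dataset

# Load the aggregated results of the latest run
results = load_dataset(
    "open-llm-leaderboard/details_martyn__mixtral-megamerge-dare-8x7b-v2",
    "results",
    split="latest",
)
print(results[0])  # inspect the stored aggregate metrics for the latest run
```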
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_martyn__mixtral-megamerge-dare-8x7b-v2 | [
"region:us"
] | 2023-12-30T05:32:07+00:00 | {"pretty_name": "Evaluation run of martyn/mixtral-megamerge-dare-8x7b-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [martyn/mixtral-megamerge-dare-8x7b-v2](https://huggingface.co/martyn/mixtral-megamerge-dare-8x7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_martyn__mixtral-megamerge-dare-8x7b-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T07:03:35.967501](https://huggingface.co/datasets/open-llm-leaderboard/details_martyn__mixtral-megamerge-dare-8x7b-v2/blob/main/results_2024-01-14T07-03-35.967501.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6893459569280364,\n \"acc_stderr\": 0.030858049040324388,\n \"acc_norm\": 0.6938293567967714,\n \"acc_norm_stderr\": 0.03145368794832943,\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5381182686685855,\n \"mc2_stderr\": 0.0153563125426782\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759091,\n \"acc_norm\": 0.6646757679180887,\n \"acc_norm_stderr\": 0.013796182947785562\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6766580362477594,\n \"acc_stderr\": 0.004667960519938637,\n \"acc_norm\": 0.8610834495120494,\n \"acc_norm_stderr\": 0.003451525868724678\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.6814814814814815,\n \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.031975658210325,\n \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.031975658210325\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7660377358490567,\n \"acc_stderr\": 0.02605529690115292,\n \"acc_norm\": 0.7660377358490567,\n \"acc_norm_stderr\": 0.02605529690115292\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n 
\"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.03476599607516477,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.03476599607516477\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.030783736757745657,\n \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.030783736757745657\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04644602091222317,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04644602091222317\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6620689655172414,\n \"acc_stderr\": 0.039417076320648906,\n \"acc_norm\": 0.6620689655172414,\n \"acc_norm_stderr\": 0.039417076320648906\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.02572209706438853,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.02572209706438853\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268556,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268556\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5566502463054187,\n \"acc_stderr\": 0.03495334582162933,\n \"acc_norm\": 0.5566502463054187,\n \"acc_norm_stderr\": 0.03495334582162933\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02655220782821529,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02655220782821529\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678185,\n \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678185\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857403,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857403\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.029213549414372167,\n \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.029213549414372167\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4304635761589404,\n \"acc_stderr\": 0.040428099613956346,\n \"acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.040428099613956346\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660836,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660836\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568606,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568606\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8734177215189873,\n \"acc_stderr\": 0.02164419572795517,\n \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.02164419572795517\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7623318385650224,\n \"acc_stderr\": 0.028568079464714284,\n \"acc_norm\": 0.7623318385650224,\n \"acc_norm_stderr\": 0.028568079464714284\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n \"acc_stderr\": 0.01911989279892498,\n \"acc_norm\": 0.905982905982906,\n \"acc_norm_stderr\": 0.01911989279892498\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8825031928480205,\n \"acc_stderr\": 0.011515102251977185,\n \"acc_norm\": 0.8825031928480205,\n \"acc_norm_stderr\": 0.011515102251977185\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7774566473988439,\n \"acc_stderr\": 0.02239421566194282,\n \"acc_norm\": 0.7774566473988439,\n \"acc_norm_stderr\": 0.02239421566194282\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n \"acc_stderr\": 0.016588680864530622,\n \"acc_norm\": 0.43687150837988825,\n \"acc_norm_stderr\": 0.016588680864530622\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824785,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824785\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n \"acc_stderr\": 0.024723861504771696,\n \"acc_norm\": 0.7459807073954984,\n \"acc_norm_stderr\": 0.024723861504771696\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8179012345679012,\n \"acc_stderr\": 0.02147349183480834,\n \"acc_norm\": 0.8179012345679012,\n \"acc_norm_stderr\": 0.02147349183480834\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5260756192959583,\n \"acc_stderr\": 0.012752858346533143,\n \"acc_norm\": 0.5260756192959583,\n \"acc_norm_stderr\": 0.012752858346533143\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144714,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144714\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7565359477124183,\n \"acc_stderr\": 0.017362473762146627,\n \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.017362473762146627\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.027212835884073142,\n \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.027212835884073142\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776348,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776348\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.02464806896136615,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.02464806896136615\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5381182686685855,\n \"mc2_stderr\": 0.0153563125426782\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.797947908445146,\n \"acc_stderr\": 0.011285013754047443\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5390447308567097,\n \"acc_stderr\": 0.01373042844911634\n }\n}\n```", "repo_url": 
"https://huggingface.co/martyn/mixtral-megamerge-dare-8x7b-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|arc:challenge|25_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|arc:challenge|25_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|gsm8k|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|gsm8k|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hellaswag|10_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hellaswag|10_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-29-42.877367.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T05-29-42.877367.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-03-35.967501.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-03-35.967501.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-03-35.967501.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T07-03-35.967501.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-29-42.877367.parquet"]}, 
{"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["**/details_harness|winogrande|5_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": ["**/details_harness|winogrande|5_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T07-03-35.967501.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T05_29_42.877367", "path": ["results_2023-12-30T05-29-42.877367.parquet"]}, {"split": "2024_01_14T07_03_35.967501", "path": 
["results_2024-01-14T07-03-35.967501.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T07-03-35.967501.parquet"]}]}]} | 2024-01-14T07:06:16+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of martyn/mixtral-megamerge-dare-8x7b-v2
Dataset automatically created during the evaluation run of model martyn/mixtral-megamerge-dare-8x7b-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
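For instance, a minimal loading sketch (assuming the details repository follows the usual leaderboard naming pattern, open-llm-leaderboard/details_martyn__mixtral-megamerge-dare-8x7b-v2, and using the harness_winogrande_5 config listed in this card's metadata):

```python
from datasets import load_dataset

# Assumed repo id, derived from the leaderboard naming convention for this model
data = load_dataset("open-llm-leaderboard/details_martyn__mixtral-megamerge-dare-8x7b-v2",
	"harness_winogrande_5",
	split="train")
```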
## Latest results
These are the latest results from run 2024-01-14T07:03:35.967501 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of martyn/mixtral-megamerge-dare-8x7b-v2\n\n\n\nDataset automatically created during the evaluation run of model martyn/mixtral-megamerge-dare-8x7b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T07:03:35.967501(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of martyn/mixtral-megamerge-dare-8x7b-v2\n\n\n\nDataset automatically created during the evaluation run of model martyn/mixtral-megamerge-dare-8x7b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T07:03:35.967501(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
197,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of martyn/mixtral-megamerge-dare-8x7b-v2\n\n\n\nDataset automatically created during the evaluation run of model martyn/mixtral-megamerge-dare-8x7b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-14T07:03:35.967501(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
06258da340cc536e0681f4d911543df62f2b38ca | # Bhagavad Gita Dataset
## Description
This dataset contains the Bhagavad Gita, a 700-verse Hindu scripture. It is part of the Indian epic Mahabharata (chapters 23–40 of the Bhishma Parva) and is written in the form of a dialogue between Prince Arjuna and Krishna, who serves as his charioteer. In the dialogue, Krishna provides guidance on how to deal with moral dilemmas and the path to spiritual enlightenment.
## Contents
The dataset contains the following columns:
- Verse: The verse in the Bhagavad Gita.
- Chapter: The chapter in which the verse is found.
- Meaning: The general meaning or theme of the verse.
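
A minimal loading sketch with the Hugging Face datasets library (the repo id OEvortex/Bhagavad_Gita and the train split are taken from this card's metadata):

```python
from datasets import load_dataset

# Load all 700 verse records from the train split
gita = load_dataset("OEvortex/Bhagavad_Gita", split="train")

# Inspect the first record
print(gita[0])
```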
| OEvortex/Bhagavad_Gita | [
"license:mit",
"region:us"
] | 2023-12-30T05:33:13+00:00 | {"license": "mit", "dataset_info": {"features": [{"name": "S.No.", "dtype": "int64"}, {"name": "Title", "dtype": "string"}, {"name": "Chapter", "dtype": "string"}, {"name": "Verse", "dtype": "string"}, {"name": "Sanskrit Anuvad", "dtype": "string"}, {"name": "Hindi Anuvad", "dtype": "string"}, {"name": "Enlgish Translation", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 697874, "num_examples": 700}], "download_size": 287784, "dataset_size": 697874}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-05T12:32:52+00:00 | [] | [] | TAGS
#license-mit #region-us
| # Bhagavad Gita Dataset
## Description
This dataset contains the Bhagavad Gita, a 700-verse Hindu scripture. It is part of the Indian epic Mahabharata (chapters 23–40 of the Bhishma Parva) and is written in the form of a dialogue between Prince Arjuna and Krishna, who serves as his charioteer. In the dialogue, Krishna provides guidance on how to deal with moral dilemmas and the path to spiritual enlightenment.
## Contents
The dataset contains the following columns:
- Verse: The verse in the Bhagavad Gita.
- Chapter: The chapter in which the verse is found.
- Meaning: The general meaning or theme of the verse.
| [
"# Bhagavad Gita Dataset",
"## Description\nThis dataset contains the Bhagavad Gita, a 700-verse Hindu scripture. It is part of the Indian epic Mahabharata (chapters 23–40 of the Bhishma Parva) and is written in the form of a dialogue between Prince Arjuna and Krishna, who serves as his charioteer. In the dialogue, Krishna provides guidance on how to deal with moral dilemmas and the path to spiritual enlightenment.",
"## Contents\nThe dataset contains the following columns:\n\n- Verse: The verse in the Bhagavad Gita.\n- Chapter: The chapter in which the verse is found.\n- Meaning: The general meaning or theme of the verse."
] | [
"TAGS\n#license-mit #region-us \n",
"# Bhagavad Gita Dataset",
"## Description\nThis dataset contains the Bhagavad Gita, a 700-verse Hindu scripture. It is part of the Indian epic Mahabharata (chapters 23–40 of the Bhishma Parva) and is written in the form of a dialogue between Prince Arjuna and Krishna, who serves as his charioteer. In the dialogue, Krishna provides guidance on how to deal with moral dilemmas and the path to spiritual enlightenment.",
"## Contents\nThe dataset contains the following columns:\n\n- Verse: The verse in the Bhagavad Gita.\n- Chapter: The chapter in which the verse is found.\n- Meaning: The general meaning or theme of the verse."
] | [
11,
7,
96,
53
] | [
"passage: TAGS\n#license-mit #region-us \n# Bhagavad Gita Dataset## Description\nThis dataset contains the Bhagavad Gita, a 700-verse Hindu scripture. It is part of the Indian epic Mahabharata (chapters 23–40 of the Bhishma Parva) and is written in the form of a dialogue between Prince Arjuna and Krishna, who serves as his charioteer. In the dialogue, Krishna provides guidance on how to deal with moral dilemmas and the path to spiritual enlightenment.## Contents\nThe dataset contains the following columns:\n\n- Verse: The verse in the Bhagavad Gita.\n- Chapter: The chapter in which the verse is found.\n- Meaning: The general meaning or theme of the verse."
] |
3db6d4a6e2ea8b464b1a7bb12bd1569535ad4f74 |
# Dataset Card for Evaluation run of spmurrayzzz/Mistral-Syndicate-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [spmurrayzzz/Mistral-Syndicate-7B](https://huggingface.co/spmurrayzzz/Mistral-Syndicate-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_spmurrayzzz__Mistral-Syndicate-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-30T05:59:03.827358](https://huggingface.co/datasets/open-llm-leaderboard/details_spmurrayzzz__Mistral-Syndicate-7B/blob/main/results_2023-12-30T05-59-03.827358.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.605141246638436,
"acc_stderr": 0.03295805344662521,
"acc_norm": 0.6090522236898664,
"acc_norm_stderr": 0.03362572955811539,
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.43728309890245215,
"mc2_stderr": 0.014415164176795973
},
"harness|arc:challenge|25": {
"acc": 0.5631399317406144,
"acc_stderr": 0.01449442158425652,
"acc_norm": 0.6083617747440273,
"acc_norm_stderr": 0.014264122124938215
},
"harness|hellaswag|10": {
"acc": 0.6285600477992431,
"acc_stderr": 0.004822022254886021,
"acc_norm": 0.8288189603664609,
"acc_norm_stderr": 0.0037589728166275895
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.02854479331905533,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.02854479331905533
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.024757473902752056,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.024757473902752056
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7,
"acc_stderr": 0.026069362295335137,
"acc_norm": 0.7,
"acc_norm_stderr": 0.026069362295335137
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124488,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.024784316942156395,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.024784316942156395
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.01714985851425095,
"acc_norm": 0.8,
"acc_norm_stderr": 0.01714985851425095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508773,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508773
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424384,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424384
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688225,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688225
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22569832402234638,
"acc_stderr": 0.013981395058455057,
"acc_norm": 0.22569832402234638,
"acc_norm_stderr": 0.013981395058455057
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537368,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537368
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291474,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44654498044328556,
"acc_stderr": 0.012697046024399684,
"acc_norm": 0.44654498044328556,
"acc_norm_stderr": 0.012697046024399684
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681397,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681397
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.019488025745529675,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.019488025745529675
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.03093285879278985,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.03093285879278985
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.43728309890245215,
"mc2_stderr": 0.014415164176795973
},
"harness|winogrande|5": {
"acc": 0.7845303867403315,
"acc_stderr": 0.011555295286059282
},
"harness|gsm8k|5": {
"acc": 0.4404852160727824,
"acc_stderr": 0.013674572131693888
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
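
The per-task layout can, however, be inferred from this card's configuration metadata: one configuration per evaluated task (plus per-benchmark groupings), each with one split per run timestamp and a `latest` alias pointing at the most recent run. Below is a minimal sketch of inspecting that layout with the `datasets` library; the config and split names are taken from the configuration metadata of this card, not from additional documentation.

```python
from datasets import get_dataset_config_names, load_dataset

# List the available configurations (one per evaluated task).
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_spmurrayzzz__Mistral-Syndicate-7B"
)
print(len(configs), configs[:5])

# Example: per-sample details for the 5-shot GSM8K run; "harness_gsm8k_5"
# and the "latest" split alias come from this card's config metadata.
gsm8k = load_dataset(
    "open-llm-leaderboard/details_spmurrayzzz__Mistral-Syndicate-7B",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k)
```
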
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_spmurrayzzz__Mistral-Syndicate-7B | [
"region:us"
] | 2023-12-30T05:53:44+00:00 | {"pretty_name": "Evaluation run of spmurrayzzz/Mistral-Syndicate-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [spmurrayzzz/Mistral-Syndicate-7B](https://huggingface.co/spmurrayzzz/Mistral-Syndicate-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_spmurrayzzz__Mistral-Syndicate-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T05:59:03.827358](https://huggingface.co/datasets/open-llm-leaderboard/details_spmurrayzzz__Mistral-Syndicate-7B/blob/main/results_2023-12-30T05-59-03.827358.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.605141246638436,\n \"acc_stderr\": 0.03295805344662521,\n \"acc_norm\": 0.6090522236898664,\n \"acc_norm_stderr\": 0.03362572955811539,\n \"mc1\": 0.29253365973072215,\n \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.43728309890245215,\n \"mc2_stderr\": 0.014415164176795973\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.01449442158425652,\n \"acc_norm\": 0.6083617747440273,\n \"acc_norm_stderr\": 0.014264122124938215\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6285600477992431,\n \"acc_stderr\": 0.004822022254886021,\n \"acc_norm\": 0.8288189603664609,\n \"acc_norm_stderr\": 0.0037589728166275895\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.02854479331905533,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.02854479331905533\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n 
\"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36243386243386244,\n \"acc_stderr\": 0.024757473902752056,\n \"acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.024757473902752056\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.026069362295335137,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.026069362295335137\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124488,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124488\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.024784316942156395,\n \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.024784316942156395\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.01714985851425095,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.01714985851425095\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n \"acc_stderr\": 0.030216831011508773,\n \"acc_norm\": 0.7174887892376681,\n \"acc_norm_stderr\": 0.030216831011508773\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n \"acc_stderr\": 0.014179171373424384,\n \"acc_norm\": 
0.8045977011494253,\n \"acc_norm_stderr\": 0.014179171373424384\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688225,\n \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688225\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22569832402234638,\n \"acc_stderr\": 0.013981395058455057,\n \"acc_norm\": 0.22569832402234638,\n \"acc_norm_stderr\": 0.013981395058455057\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537368,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537368\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291474,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291474\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44654498044328556,\n \"acc_stderr\": 0.012697046024399684,\n \"acc_norm\": 0.44654498044328556,\n \"acc_norm_stderr\": 0.012697046024399684\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681397,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681397\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.019488025745529675,\n \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.019488025745529675\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.03093285879278985,\n \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.03093285879278985\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29253365973072215,\n \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.43728309890245215,\n \"mc2_stderr\": 0.014415164176795973\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7845303867403315,\n \"acc_stderr\": 0.011555295286059282\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4404852160727824,\n \"acc_stderr\": 0.013674572131693888\n }\n}\n```", "repo_url": 
"https://huggingface.co/spmurrayzzz/Mistral-Syndicate-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|arc:challenge|25_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|arc:challenge|25_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|gsm8k|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|gsm8k|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hellaswag|10_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hellaswag|10_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-51-29.447448.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T05-51-29.447448.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-59-03.827358.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-59-03.827358.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-59-03.827358.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T05-59-03.827358.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-51-29.447448.parquet"]}, 
{"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["**/details_harness|winogrande|5_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": ["**/details_harness|winogrande|5_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T05-59-03.827358.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T05_51_29.447448", "path": ["results_2023-12-30T05-51-29.447448.parquet"]}, {"split": "2023_12_30T05_59_03.827358", "path": 
["results_2023-12-30T05-59-03.827358.parquet"]}, {"split": "latest", "path": ["results_2023-12-30T05-59-03.827358.parquet"]}]}]} | 2023-12-30T06:01:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of spmurrayzzz/Mistral-Syndicate-7B
Dataset automatically created during the evaluation run of model spmurrayzzz/Mistral-Syndicate-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
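A minimal loading sketch, assuming the details repository follows the leaderboard's usual naming convention (`open-llm-leaderboard/details_spmurrayzzz__Mistral-Syndicate-7B` is inferred from the model name, not stated in this card):

```python
from datasets import load_dataset

# Repository id assumed from the standard "details_<org>__<model>" pattern
data = load_dataset("open-llm-leaderboard/details_spmurrayzzz__Mistral-Syndicate-7B",
	"harness_winogrande_5",
	split="train")
```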
## Latest results
These are the latest results from run 2023-12-30T05:59:03.827358 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of spmurrayzzz/Mistral-Syndicate-7B\n\n\n\nDataset automatically created during the evaluation run of model spmurrayzzz/Mistral-Syndicate-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T05:59:03.827358(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of spmurrayzzz/Mistral-Syndicate-7B\n\n\n\nDataset automatically created during the evaluation run of model spmurrayzzz/Mistral-Syndicate-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T05:59:03.827358(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of spmurrayzzz/Mistral-Syndicate-7B\n\n\n\nDataset automatically created during the evaluation run of model spmurrayzzz/Mistral-Syndicate-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T05:59:03.827358(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
af7ff3beedc3d98c86010cc582767c47547efe68 | # Dataset Card for "boolq-translated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ai4bharat/boolq-hi | [
"region:us"
] | 2023-12-30T06:03:37+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "bool"}, {"name": "passage", "dtype": "string"}, {"name": "itv2 hi question", "dtype": "string"}, {"name": "itv2 hi passage", "dtype": "string"}, {"name": "itv2 hi question en", "dtype": "string"}, {"name": "itv2 hi passage en", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 25356277, "num_examples": 9427}, {"name": "validation", "num_bytes": 8638084, "num_examples": 3270}], "download_size": 16628479, "dataset_size": 33994361}} | 2023-12-30T06:15:05+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "boolq-translated"
More Information needed | [
"# Dataset Card for \"boolq-translated\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"boolq-translated\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"boolq-translated\"\n\nMore Information needed"
] |
0f7d82b05590d88889c879431c2625b9d9cba753 | # Dataset Card for "gamio-ai-authorLM-dataset-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | 0x7o/gamio-ai-authorLM-dataset-v2 | [
"region:us"
] | 2023-12-30T06:19:14+00:00 | {"dataset_info": {"features": [{"name": "texts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5247208, "num_examples": 200}], "download_size": 2361264, "dataset_size": 5247208}} | 2023-12-30T06:19:18+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "gamio-ai-authorLM-dataset-v2"
More Information needed | [
"# Dataset Card for \"gamio-ai-authorLM-dataset-v2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"gamio-ai-authorLM-dataset-v2\"\n\nMore Information needed"
] | [
6,
25
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"gamio-ai-authorLM-dataset-v2\"\n\nMore Information needed"
] |
b23a0cf91bf013a4170e39aa64f0ab406558fcd1 |
This Hindi text dataset, comprising 10,000 rows, offers a comprehensive array of content encompassing diverse aspects of Hindi topics,
ranging from education and comedy to computer science and machine learning. The dataset serves as a valuable resource for finetuning or training models,
with a specific emphasis on Hinglish language processing. Its extensive coverage across various domains makes it particularly advantageous for enhancing the performance of models tailored to Hinglish, such as the openHathi model.
Researchers, developers, and practitioners can leverage this dataset to refine language models, ensuring improved accuracy and relevance in understanding and generating content across a spectrum of Hindi subjects. | shuvom/YThindi | [
"language:hi",
"license:mit",
"hinglish",
"hindi",
"openhathi",
"region:us"
] | 2023-12-30T06:38:00+00:00 | {"language": ["hi"], "license": "mit", "tags": ["hinglish", "hindi", "openhathi"]} | 2023-12-30T14:00:29+00:00 | [] | [
"hi"
] | TAGS
#language-Hindi #license-mit #hinglish #hindi #openhathi #region-us
|
This Hindi text dataset, comprising 10,000 rows, offers a comprehensive array of content encompassing diverse aspects of Hindi topics,
ranging from education and comedy to computer science and machine learning. The dataset serves as a valuable resource for finetuning or training models,
with a specific emphasis on Hinglish language processing. Its extensive coverage across various domains makes it particularly advantageous for enhancing the performance of models tailored to Hinglish, such as the openHathi model.
Researchers, developers, and practitioners can leverage this dataset to refine language models, ensuring improved accuracy and relevance in understanding and generating content across a spectrum of Hindi subjects. | [] | [
"TAGS\n#language-Hindi #license-mit #hinglish #hindi #openhathi #region-us \n"
] | [
24
] | [
"passage: TAGS\n#language-Hindi #license-mit #hinglish #hindi #openhathi #region-us \n"
] |
5df62818977711aee884bf697b297413f00d174d |
# Dataset Card for Evaluation run of uukuguy/speechless-coder-ds-1.3b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [uukuguy/speechless-coder-ds-1.3b](https://huggingface.co/uukuguy/speechless-coder-ds-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-coder-ds-1.3b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-30T06:48:01.416618](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-coder-ds-1.3b/blob/main/results_2023-12-30T06-48-01.416618.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2506488389989993,
"acc_stderr": 0.030580639760232627,
"acc_norm": 0.25128428880523795,
"acc_norm_stderr": 0.03131936078924142,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156507,
"mc2": 0.4211587968245106,
"mc2_stderr": 0.01485785907132671
},
"harness|arc:challenge|25": {
"acc": 0.23890784982935154,
"acc_stderr": 0.012461071376316614,
"acc_norm": 0.26535836177474403,
"acc_norm_stderr": 0.012902554762313967
},
"harness|hellaswag|10": {
"acc": 0.3313085042820155,
"acc_stderr": 0.004697217912462985,
"acc_norm": 0.39494124676359293,
"acc_norm_stderr": 0.0048783902265917105
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.035478541985608236,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.035478541985608236
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.16447368421052633,
"acc_stderr": 0.0301675334686327,
"acc_norm": 0.16447368421052633,
"acc_norm_stderr": 0.0301675334686327
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.027008766090708094,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.027008766090708094
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.16,
"acc_stderr": 0.0368452949177471,
"acc_norm": 0.16,
"acc_norm_stderr": 0.0368452949177471
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.03036358219723817,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.03036358219723817
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708614,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708614
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276862,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276862
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.21935483870967742,
"acc_stderr": 0.02354079935872329,
"acc_norm": 0.21935483870967742,
"acc_norm_stderr": 0.02354079935872329
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21182266009852216,
"acc_stderr": 0.02874898368994107,
"acc_norm": 0.21182266009852216,
"acc_norm_stderr": 0.02874898368994107
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24870466321243523,
"acc_stderr": 0.031195840877700314,
"acc_norm": 0.24870466321243523,
"acc_norm_stderr": 0.031195840877700314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.020932445774463196,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.020932445774463196
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.02592887613276612,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.02592887613276612
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436775,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.018125669180861496,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.018125669180861496
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.03058759135160424,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.03058759135160424
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.029178682304842562,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.029178682304842562
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3183856502242152,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.3183856502242152,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286773,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286773
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3247863247863248,
"acc_stderr": 0.030679022765498828,
"acc_norm": 0.3247863247863248,
"acc_norm_stderr": 0.030679022765498828
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2937420178799489,
"acc_stderr": 0.016287759388491682,
"acc_norm": 0.2937420178799489,
"acc_norm_stderr": 0.016287759388491682
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2837988826815642,
"acc_stderr": 0.015078358970751778,
"acc_norm": 0.2837988826815642,
"acc_norm_stderr": 0.015078358970751778
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.024954184324879912,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.024954184324879912
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2090032154340836,
"acc_stderr": 0.02309314039837422,
"acc_norm": 0.2090032154340836,
"acc_norm_stderr": 0.02309314039837422
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967277,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290392,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290392
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23598435462842243,
"acc_stderr": 0.01084480266966268,
"acc_norm": 0.23598435462842243,
"acc_norm_stderr": 0.01084480266966268
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910507,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.15918367346938775,
"acc_stderr": 0.02342097206916635,
"acc_norm": 0.15918367346938775,
"acc_norm_stderr": 0.02342097206916635
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156507,
"mc2": 0.4211587968245106,
"mc2_stderr": 0.01485785907132671
},
"harness|winogrande|5": {
"acc": 0.5303867403314917,
"acc_stderr": 0.014026510839428737
},
"harness|gsm8k|5": {
"acc": 0.02350265352539803,
"acc_stderr": 0.004172883669643965
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_uukuguy__speechless-coder-ds-1.3b | [
"region:us"
] | 2023-12-30T06:50:17+00:00 | {"pretty_name": "Evaluation run of uukuguy/speechless-coder-ds-1.3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [uukuguy/speechless-coder-ds-1.3b](https://huggingface.co/uukuguy/speechless-coder-ds-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-coder-ds-1.3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T06:48:01.416618](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-coder-ds-1.3b/blob/main/results_2023-12-30T06-48-01.416618.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2506488389989993,\n \"acc_stderr\": 0.030580639760232627,\n \"acc_norm\": 0.25128428880523795,\n \"acc_norm_stderr\": 0.03131936078924142,\n \"mc1\": 0.25458996328029376,\n \"mc1_stderr\": 0.015250117079156507,\n \"mc2\": 0.4211587968245106,\n \"mc2_stderr\": 0.01485785907132671\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.23890784982935154,\n \"acc_stderr\": 0.012461071376316614,\n \"acc_norm\": 0.26535836177474403,\n \"acc_norm_stderr\": 0.012902554762313967\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3313085042820155,\n \"acc_stderr\": 0.004697217912462985,\n \"acc_norm\": 0.39494124676359293,\n \"acc_norm_stderr\": 0.0048783902265917105\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.21481481481481482,\n \"acc_stderr\": 0.035478541985608236,\n \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.035478541985608236\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.16447368421052633,\n \"acc_stderr\": 0.0301675334686327,\n \"acc_norm\": 0.16447368421052633,\n \"acc_norm_stderr\": 0.0301675334686327\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708094,\n \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708094\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 0.0368452949177471,\n \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.0368452949177471\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.03036358219723817,\n \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.03036358219723817\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708614,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708614\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n \"acc_stderr\": 0.03567016675276862,\n \"acc_norm\": 0.1984126984126984,\n \"acc_norm_stderr\": 0.03567016675276862\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.21935483870967742,\n \"acc_stderr\": 0.02354079935872329,\n \"acc_norm\": 0.21935483870967742,\n \"acc_norm_stderr\": 0.02354079935872329\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.21182266009852216,\n \"acc_stderr\": 0.02874898368994107,\n \"acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.02874898368994107\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.24870466321243523,\n \"acc_stderr\": 0.031195840877700314,\n \"acc_norm\": 0.24870466321243523,\n \"acc_norm_stderr\": 0.031195840877700314\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.020932445774463196,\n \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.020932445774463196\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.02592887613276612,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.02592887613276612\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436775,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23302752293577983,\n \"acc_stderr\": 0.018125669180861496,\n \"acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.018125669180861496\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n \"acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.03058759135160424,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.03058759135160424\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.27848101265822783,\n \"acc_stderr\": 0.029178682304842562,\n \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.029178682304842562\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3183856502242152,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.3183856502242152,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2809917355371901,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286773,\n \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286773\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3247863247863248,\n \"acc_stderr\": 0.030679022765498828,\n \"acc_norm\": 0.3247863247863248,\n \"acc_norm_stderr\": 0.030679022765498828\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.2937420178799489,\n \"acc_stderr\": 0.016287759388491682,\n \"acc_norm\": 0.2937420178799489,\n \"acc_norm_stderr\": 0.016287759388491682\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2837988826815642,\n \"acc_stderr\": 0.015078358970751778,\n \"acc_norm\": 0.2837988826815642,\n \"acc_norm_stderr\": 0.015078358970751778\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.024954184324879912,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.024954184324879912\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2090032154340836,\n \"acc_stderr\": 0.02309314039837422,\n \"acc_norm\": 0.2090032154340836,\n \"acc_norm_stderr\": 0.02309314039837422\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290392,\n \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290392\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23598435462842243,\n \"acc_stderr\": 0.01084480266966268,\n \"acc_norm\": 0.23598435462842243,\n \"acc_norm_stderr\": 0.01084480266966268\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3088235294117647,\n \"acc_stderr\": 0.028064998167040094,\n \"acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.028064998167040094\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n \"acc_stderr\": 0.04461272175910507,\n \"acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.04461272175910507\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.15918367346938775,\n \"acc_stderr\": 0.02342097206916635,\n \"acc_norm\": 0.15918367346938775,\n \"acc_norm_stderr\": 0.02342097206916635\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n \"mc1_stderr\": 0.015250117079156507,\n \"mc2\": 0.4211587968245106,\n \"mc2_stderr\": 0.01485785907132671\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5303867403314917,\n \"acc_stderr\": 0.014026510839428737\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02350265352539803,\n 
\"acc_stderr\": 0.004172883669643965\n }\n}\n```", "repo_url": "https://huggingface.co/uukuguy/speechless-coder-ds-1.3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|arc:challenge|25_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|gsm8k|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hellaswag|10_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T06-48-01.416618.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T06-48-01.416618.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T06-48-01.416618.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T06-48-01.416618.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T06-48-01.416618.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T06_48_01.416618", "path": ["**/details_harness|winogrande|5_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T06-48-01.416618.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_30T06_48_01.416618", "path": ["results_2023-12-30T06-48-01.416618.parquet"]}, {"split": "latest", "path": ["results_2023-12-30T06-48-01.416618.parquet"]}]}]} | 2023-12-30T06:50:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of uukuguy/speechless-coder-ds-1.3b
Dataset automatically created during the evaluation run of model uukuguy/speechless-coder-ds-1.3b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
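The snippet below is a minimal sketch; the repository id is assumed from the leaderboard's usual `details_<org>__<model>` naming convention for this model, and `harness_winogrande_5` is one of the configurations listed for this dataset:

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's details_<org>__<model> convention
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-coder-ds-1.3b",
	"harness_winogrande_5",
	split="train")
```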
## Latest results
These are the latest results from run 2023-12-30T06:48:01.416618 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of uukuguy/speechless-coder-ds-1.3b\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-coder-ds-1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T06:48:01.416618(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of uukuguy/speechless-coder-ds-1.3b\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-coder-ds-1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T06:48:01.416618(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of uukuguy/speechless-coder-ds-1.3b\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-coder-ds-1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T06:48:01.416618(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
9c81aed5c49e58ba6c7c2bb485d109dfeea0933a |
# Dataset Card for "KULLM-v2"
## Dataset Summary
Korean translation of GPT4ALL, Dolly, and Vicuna data.
repository: [nlpai-lab/KULLM](https://github.com/nlpai-lab/KULLM)
huggingface: [nlpai-lab/kullm-v2](https://huggingface.co/nlpai-lab/kullm-polyglot-12.8b-v2)
#### Translate dataset
The 'instruction', 'input', and 'output' fields in the dataset were translated via the DeepL API.
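As an illustration only, a field-level translation of this kind could be sketched with the official `deepl` Python client; the actual script, authentication key, and batching used to build the dataset are not part of this card and are assumptions here:

```python
import deepl

# Hypothetical auth key; the real translation pipeline is not included in this card
translator = deepl.Translator("YOUR_DEEPL_AUTH_KEY")

def translate_example(example):
    # Translate each text field to Korean, leaving empty fields (e.g. blank 'input') untouched
    for key in ("instruction", "input", "output"):
        if example[key]:
            example[key] = translator.translate_text(example[key], target_lang="KO").text
    return example
```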
## License
Apache-2.0
```python
>>> from datasets import load_dataset
>>> ds = load_dataset("nlpai-lab/kullm-v2", split="train")
>>> ds
DatasetDict({
train: Dataset({
features: ['id', 'instruction', 'input', 'output'],
num_rows: 152630
})
})
```
```python
>>> ds[0]
{'id': 'alpaca_{idx}',
'instruction': '3원색이란 무엇인가요?',
'input': '',
'output': '세 가지 기본 색은 빨강, 파랑, 노랑입니다. 이 색은 다른 색을 혼합하여 만들 수 없고 다른 모든 색은 다양한 비율로 조합하여 만들 수 있기 때문에 원색이라고 부릅니다. 빛에 사용되는 첨가제 색상 시스템에서 원색은 빨강, 녹색, 파랑(RGB)입니다.'}
``` | csujeong/kullm-v2.1 | [
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:ko",
"license:apache-2.0",
"region:us"
] | 2023-12-30T06:55:21+00:00 | {"language": ["ko"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation"], "pretty_name": "kullm"} | 2023-12-30T06:55:44+00:00 | [] | [
"ko"
] | TAGS
#task_categories-text-generation #size_categories-10K<n<100K #language-Korean #license-apache-2.0 #region-us
|
# Dataset Card for "KULLM-v2"
## Dataset Summary
Korean translation of GPT4ALL, Dolly, and Vicuna data.
repository: nlpai-lab/KULLM
huggingface: nlpai-lab/kullm-v2
#### Translate dataset
Translated 'instruction', 'input', and 'output' in the dataset via the DeepL API
## Lisence
Apache-2.0
| [
"# Dataset Card for \"KULLM-v2\"",
"## Dataset Summary\n\nKorean translation of GPT4ALL, Dolly, and Vicuna data.\n\n\nrepository: nlpai-lab/KULLM\n\nhuggingface: nlpai-lab/kullm-v2",
"#### Translate dataset\n\nTranslated 'instruction', 'input', and 'output' in the dataset via the DeepL API",
"## Lisence\nApache-2.0"
] | [
"TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #language-Korean #license-apache-2.0 #region-us \n",
"# Dataset Card for \"KULLM-v2\"",
"## Dataset Summary\n\nKorean translation of GPT4ALL, Dolly, and Vicuna data.\n\n\nrepository: nlpai-lab/KULLM\n\nhuggingface: nlpai-lab/kullm-v2",
"#### Translate dataset\n\nTranslated 'instruction', 'input', and 'output' in the dataset via the DeepL API",
"## Lisence\nApache-2.0"
] | [
42,
12,
48,
32,
7
] | [
"passage: TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #language-Korean #license-apache-2.0 #region-us \n# Dataset Card for \"KULLM-v2\"## Dataset Summary\n\nKorean translation of GPT4ALL, Dolly, and Vicuna data.\n\n\nrepository: nlpai-lab/KULLM\n\nhuggingface: nlpai-lab/kullm-v2#### Translate dataset\n\nTranslated 'instruction', 'input', and 'output' in the dataset via the DeepL API## Lisence\nApache-2.0"
] |
b9f209922552cce9e72c188652d591f54942898e |
# Dataset Card for Evaluation run of uukuguy/speechless-coder-ds-6.7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [uukuguy/speechless-coder-ds-6.7b](https://huggingface.co/uukuguy/speechless-coder-ds-6.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-coder-ds-6.7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-30T07:08:30.796108](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-coder-ds-6.7b/blob/main/results_2023-12-30T07-08-30.796108.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.38073989952019327,
"acc_stderr": 0.03433559818958823,
"acc_norm": 0.38307431216916843,
"acc_norm_stderr": 0.0350891686808636,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766373,
"mc2": 0.4167302788975791,
"mc2_stderr": 0.014552137962691033
},
"harness|arc:challenge|25": {
"acc": 0.3378839590443686,
"acc_stderr": 0.013822047922283516,
"acc_norm": 0.36860068259385664,
"acc_norm_stderr": 0.014097810678042185
},
"harness|hellaswag|10": {
"acc": 0.40300736904999,
"acc_stderr": 0.0048949977367190485,
"acc_norm": 0.5245966938856802,
"acc_norm_stderr": 0.004983740145218606
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996794,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996794
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.42641509433962266,
"acc_stderr": 0.030437794342983045,
"acc_norm": 0.42641509433962266,
"acc_norm_stderr": 0.030437794342983045
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3541666666666667,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.3541666666666667,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3699421965317919,
"acc_stderr": 0.036812296333943194,
"acc_norm": 0.3699421965317919,
"acc_norm_stderr": 0.036812296333943194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835361,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835361
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.0236369759961018,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.0236369759961018
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4161290322580645,
"acc_stderr": 0.028040981380761543,
"acc_norm": 0.4161290322580645,
"acc_norm_stderr": 0.028040981380761543
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694436,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694436
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03681050869161549,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03681050869161549
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.42424242424242425,
"acc_stderr": 0.03521224908841583,
"acc_norm": 0.42424242424242425,
"acc_norm_stderr": 0.03521224908841583
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.39378238341968913,
"acc_stderr": 0.03526077095548237,
"acc_norm": 0.39378238341968913,
"acc_norm_stderr": 0.03526077095548237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.35384615384615387,
"acc_stderr": 0.024243783994062164,
"acc_norm": 0.35384615384615387,
"acc_norm_stderr": 0.024243783994062164
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3277310924369748,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.3277310924369748,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3577981651376147,
"acc_stderr": 0.02055206078482782,
"acc_norm": 0.3577981651376147,
"acc_norm_stderr": 0.02055206078482782
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.03344887382997867,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.03344887382997867
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.03393388584958406,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.03393388584958406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3459915611814346,
"acc_stderr": 0.03096481058878671,
"acc_norm": 0.3459915611814346,
"acc_norm_stderr": 0.03096481058878671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.35874439461883406,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.35874439461883406,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48091603053435117,
"acc_stderr": 0.04382094705550989,
"acc_norm": 0.48091603053435117,
"acc_norm_stderr": 0.04382094705550989
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4380165289256198,
"acc_stderr": 0.045291468044357915,
"acc_norm": 0.4380165289256198,
"acc_norm_stderr": 0.045291468044357915
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44171779141104295,
"acc_stderr": 0.039015918258361836,
"acc_norm": 0.44171779141104295,
"acc_norm_stderr": 0.039015918258361836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.39805825242718446,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.39805825242718446,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.03222414045241108,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.03222414045241108
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.40485312899106,
"acc_stderr": 0.017553246467720253,
"acc_norm": 0.40485312899106,
"acc_norm_stderr": 0.017553246467720253
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.02622615860512465,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.02622615860512465
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2871508379888268,
"acc_stderr": 0.015131608849963729,
"acc_norm": 0.2871508379888268,
"acc_norm_stderr": 0.015131608849963729
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02791405551046802,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02791405551046802
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3729903536977492,
"acc_stderr": 0.027466610213140105,
"acc_norm": 0.3729903536977492,
"acc_norm_stderr": 0.027466610213140105
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.026041766202717167,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.026041766202717167
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.0271871270115038,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.0271871270115038
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.30182529335071706,
"acc_stderr": 0.01172435051810589,
"acc_norm": 0.30182529335071706,
"acc_norm_stderr": 0.01172435051810589
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.028418208619406787,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.028418208619406787
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.01869085027359528,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.01869085027359528
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4326530612244898,
"acc_stderr": 0.031717528240626645,
"acc_norm": 0.4326530612244898,
"acc_norm_stderr": 0.031717528240626645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.36318407960199006,
"acc_stderr": 0.034005985055990146,
"acc_norm": 0.36318407960199006,
"acc_norm_stderr": 0.034005985055990146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.36257309941520466,
"acc_stderr": 0.036871306155620606,
"acc_norm": 0.36257309941520466,
"acc_norm_stderr": 0.036871306155620606
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766373,
"mc2": 0.4167302788975791,
"mc2_stderr": 0.014552137962691033
},
"harness|winogrande|5": {
"acc": 0.5887924230465666,
"acc_stderr": 0.013829128358676876
},
"harness|gsm8k|5": {
"acc": 0.18726307808946172,
"acc_stderr": 0.010745914199510825
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_uukuguy__speechless-coder-ds-6.7b | [
"region:us"
] | 2023-12-30T07:10:42+00:00 | {"pretty_name": "Evaluation run of uukuguy/speechless-coder-ds-6.7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [uukuguy/speechless-coder-ds-6.7b](https://huggingface.co/uukuguy/speechless-coder-ds-6.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-coder-ds-6.7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T07:08:30.796108](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-coder-ds-6.7b/blob/main/results_2023-12-30T07-08-30.796108.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38073989952019327,\n \"acc_stderr\": 0.03433559818958823,\n \"acc_norm\": 0.38307431216916843,\n \"acc_norm_stderr\": 0.0350891686808636,\n \"mc1\": 0.2607099143206854,\n \"mc1_stderr\": 0.015368841620766373,\n \"mc2\": 0.4167302788975791,\n \"mc2_stderr\": 0.014552137962691033\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3378839590443686,\n \"acc_stderr\": 0.013822047922283516,\n \"acc_norm\": 0.36860068259385664,\n \"acc_norm_stderr\": 0.014097810678042185\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.40300736904999,\n \"acc_stderr\": 0.0048949977367190485,\n \"acc_norm\": 0.5245966938856802,\n \"acc_norm_stderr\": 0.004983740145218606\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.04094376269996794,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.04094376269996794\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.03782728980865469,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.03782728980865469\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.42641509433962266,\n \"acc_stderr\": 0.030437794342983045,\n \"acc_norm\": 0.42641509433962266,\n \"acc_norm_stderr\": 0.030437794342983045\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3541666666666667,\n \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.3541666666666667,\n \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3699421965317919,\n \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.3699421965317919,\n \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835361,\n \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835361\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30158730158730157,\n \"acc_stderr\": 0.0236369759961018,\n \"acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.0236369759961018\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4161290322580645,\n \"acc_stderr\": 0.028040981380761543,\n \"acc_norm\": 0.4161290322580645,\n \"acc_norm_stderr\": 0.028040981380761543\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03681050869161549,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03681050869161549\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.42424242424242425,\n \"acc_stderr\": 0.03521224908841583,\n \"acc_norm\": 0.42424242424242425,\n \"acc_norm_stderr\": 0.03521224908841583\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.39378238341968913,\n \"acc_stderr\": 0.03526077095548237,\n \"acc_norm\": 0.39378238341968913,\n \"acc_norm_stderr\": 0.03526077095548237\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.35384615384615387,\n \"acc_stderr\": 0.024243783994062164,\n \"acc_norm\": 0.35384615384615387,\n \"acc_norm_stderr\": 0.024243783994062164\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3577981651376147,\n \"acc_stderr\": 0.02055206078482782,\n \"acc_norm\": 0.3577981651376147,\n \"acc_norm_stderr\": 0.02055206078482782\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4027777777777778,\n \"acc_stderr\": 0.03344887382997867,\n \"acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.03344887382997867\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.03393388584958406,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.03393388584958406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.3459915611814346,\n \"acc_stderr\": 0.03096481058878671,\n \"acc_norm\": 0.3459915611814346,\n \"acc_norm_stderr\": 0.03096481058878671\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.35874439461883406,\n \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.35874439461883406,\n \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.48091603053435117,\n \"acc_stderr\": 0.04382094705550989,\n \"acc_norm\": 0.48091603053435117,\n \"acc_norm_stderr\": 0.04382094705550989\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.4380165289256198,\n \"acc_stderr\": 0.045291468044357915,\n \"acc_norm\": 0.4380165289256198,\n \"acc_norm_stderr\": 0.045291468044357915\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3611111111111111,\n \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.44171779141104295,\n \"acc_stderr\": 0.039015918258361836,\n \"acc_norm\": 0.44171779141104295,\n \"acc_norm_stderr\": 0.039015918258361836\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.39805825242718446,\n \"acc_stderr\": 0.04846748253977239,\n \"acc_norm\": 0.39805825242718446,\n \"acc_norm_stderr\": 0.04846748253977239\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.03222414045241108,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.03222414045241108\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.40485312899106,\n \"acc_stderr\": 0.017553246467720253,\n \"acc_norm\": 0.40485312899106,\n \"acc_norm_stderr\": 0.017553246467720253\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3872832369942196,\n \"acc_stderr\": 0.02622615860512465,\n \"acc_norm\": 0.3872832369942196,\n \"acc_norm_stderr\": 0.02622615860512465\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2871508379888268,\n \"acc_stderr\": 0.015131608849963729,\n \"acc_norm\": 0.2871508379888268,\n \"acc_norm_stderr\": 0.015131608849963729\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.02791405551046802,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02791405551046802\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3729903536977492,\n \"acc_stderr\": 0.027466610213140105,\n \"acc_norm\": 0.3729903536977492,\n \"acc_norm_stderr\": 0.027466610213140105\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.026041766202717167,\n \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.026041766202717167\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.29432624113475175,\n \"acc_stderr\": 0.0271871270115038,\n \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.0271871270115038\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.30182529335071706,\n \"acc_stderr\": 0.01172435051810589,\n \"acc_norm\": 0.30182529335071706,\n \"acc_norm_stderr\": 0.01172435051810589\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.028418208619406787,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.028418208619406787\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3088235294117647,\n \"acc_stderr\": 0.01869085027359528,\n \"acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.01869085027359528\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4326530612244898,\n \"acc_stderr\": 0.031717528240626645,\n \"acc_norm\": 0.4326530612244898,\n \"acc_norm_stderr\": 0.031717528240626645\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.36318407960199006,\n \"acc_stderr\": 0.034005985055990146,\n \"acc_norm\": 0.36318407960199006,\n \"acc_norm_stderr\": 0.034005985055990146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.36257309941520466,\n \"acc_stderr\": 0.036871306155620606,\n \"acc_norm\": 0.36257309941520466,\n \"acc_norm_stderr\": 0.036871306155620606\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n \"mc1_stderr\": 0.015368841620766373,\n \"mc2\": 0.4167302788975791,\n \"mc2_stderr\": 0.014552137962691033\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5887924230465666,\n \"acc_stderr\": 0.013829128358676876\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18726307808946172,\n \"acc_stderr\": 0.010745914199510825\n }\n}\n```", "repo_url": 
"https://huggingface.co/uukuguy/speechless-coder-ds-6.7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|arc:challenge|25_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|gsm8k|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hellaswag|10_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T07-08-30.796108.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T07-08-30.796108.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T07-08-30.796108.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T07-08-30.796108.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T07-08-30.796108.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T07-08-30.796108.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["**/details_harness|winogrande|5_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T07-08-30.796108.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T07_08_30.796108", "path": ["results_2023-12-30T07-08-30.796108.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T07-08-30.796108.parquet"]}]}]} | 2023-12-30T07:11:06+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of uukuguy/speechless-coder-ds-6.7b
Dataset automatically created during the evaluation run of model uukuguy/speechless-coder-ds-6.7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
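
For example (a minimal sketch; the repository id and configuration name are taken from this card's metadata, and any other configuration listed there, such as `harness_gsm8k_5`, can be substituted):

```python
from datasets import load_dataset

# Load one evaluation config from this run's details repository;
# the "train" split always points to the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-coder-ds-6.7b",
    "harness_winogrande_5",
    split="train",
)
```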
## Latest results
These are the latest results from run 2023-12-30T07:08:30.796108 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of uukuguy/speechless-coder-ds-6.7b\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-coder-ds-6.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T07:08:30.796108(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of uukuguy/speechless-coder-ds-6.7b\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-coder-ds-6.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T07:08:30.796108(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of uukuguy/speechless-coder-ds-6.7b\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-coder-ds-6.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T07:08:30.796108(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
960b097a91d008c098c41c097bf1fbf5b07a7a91 | fs | taenz9r/fs | [
"region:us"
] | 2023-12-30T07:20:08+00:00 | {} | 2023-12-30T07:20:23+00:00 | [] | [] | TAGS
#region-us
| fs | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
34aff85b1b66545603d8ffd397d7fb17f65ad006 | # Rhino Dataset
The Rhino dataset is a comprehensive instruction-following dataset designed to train a highly performant language model named RhinoBeetle. This dataset aims to combine both quality and quantity to facilitate robust machine learning applications.
## Construction Blueprint
To create the Rhino dataset, a collection of diverse datasets will be concatenated to form the initial raw data. These source datasets include:
- LDJnr/Verified-Camel
- glaiveai/glaive-code-assistant-v2
- LDJnr/Pure-Dove
- meta-math/MetaMathQA
- VMware/open-instruct
- TIGER-Lab/MathInstruct
- LDJnr/Capybara
- OpenOrca GPT-4
After the initial concatenation, the dataset will be subjected to a basic cleaning process to exclude any examples containing Reinforcement Learning from Human Feedback (RLHF) refusals.
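
A sketch of this cleaning pass is shown below. It assumes the concatenated sources have already been mapped to a shared schema with a single `text` column, and the refusal phrases are illustrative rather than the exact list used for Rhino:

```python
# Hypothetical refusal markers for the basic cleaning pass; the real phrase
# list and the column name ("text") are assumptions for illustration.
REFUSAL_MARKERS = (
    "as an ai language model",
    "i cannot assist with that",
    "i'm sorry, but i can't",
)

def is_clean(example):
    text = example["text"].lower()
    return not any(marker in text for marker in REFUSAL_MARKERS)

# rhino_raw: the concatenation of the source datasets listed above.
rhino_cleaned = rhino_raw.filter(is_clean)
```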
## Quality Scoring and Selection
A regression model, such as Tiny Llama for sequence classification, will be employed to evaluate each example in the raw Rhino data based on quality. This model will be trained on a curated version of the Nectar dataset.
The scoring of examples in the Nectar dataset is calculated using the following function:
Given a list of answers \( A \) and a randomly chosen answer \( a \in A \), the score \( S \) for the chosen answer \( a \) is calculated as:
\[ S = 1 - \frac{\text{index}(a)}{|A|} \]
where \( \text{index}(a) \) is the position of the randomly chosen answer \( a \) in the list \( A \), and \( |A| \) is the total number of answers in the list.
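
As a small sketch, this scoring rule maps directly to a few lines of Python (assuming a zero-based index, so the top-ranked answer scores 1.0 and the lowest-ranked answer scores 1/|A|):

```python
import random

def nectar_quality_score(answers):
    """Sample one answer from a ranked list and score it as S = 1 - index/|A|."""
    index = random.randrange(len(answers))  # zero-based position in the ranking
    return answers[index], 1.0 - index / len(answers)

# With 5 ranked answers the possible scores are 1.0, 0.8, 0.6, 0.4 and 0.2.
chosen, score = nectar_quality_score(["a1", "a2", "a3", "a4", "a5"])
```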
The sequence classification model will use this scoring function to assess the raw Rhino data and determine which examples to retain or discard based on their assessed quality.
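
Once the regression model has attached a predicted score to every raw example, the retain-or-discard decision reduces to a threshold filter. The column name `quality_score` is an assumption, and the cut-off mirrors the 0.05-0.1 range suggested in the Usage section below:

```python
QUALITY_THRESHOLD = 0.1  # anywhere in the suggested 0.05-0.1 range

# rhino_scored: the raw data with a model-predicted "quality_score" column (assumed).
rhino = rhino_scored.filter(lambda ex: ex["quality_score"] >= QUALITY_THRESHOLD)
```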
## Licensing Considerations
It is crucial to note that the datasets incorporated into the Rhino dataset may be subject to strict licensing agreements. Prior to using the Rhino dataset, one must thoroughly review and comply with the licensing terms of the constituent datasets.
## Contributions
Thanks to @jantxu, @jtatman, and @Locutusque for their contributions to this dataset.
## Usage
We recommend skipping examples with a quality score that is less than 0.05-0.1 while fine-tuning a language model. | M4-ai/Rhino | [
"task_categories:text-generation",
"task_categories:conversational",
"task_categories:question-answering",
"size_categories:1M<n<10M",
"language:en",
"license:apache-2.0",
"region:us"
] | 2023-12-30T08:36:59+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1M<n<10M"], "task_categories": ["text-generation", "conversational", "question-answering"]} | 2024-01-14T18:23:56+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #task_categories-conversational #task_categories-question-answering #size_categories-1M<n<10M #language-English #license-apache-2.0 #region-us
| # Rhino Dataset
The Rhino dataset is a comprehensive instruction-following dataset designed to train a highly performant language model named RhinoBeetle. This dataset aims to combine both quality and quantity to facilitate robust machine learning applications.
## Construction Blueprint
To create the Rhino dataset, a collection of diverse datasets will be concatenated to form the initial raw data. These source datasets include:
- LDJnr/Verified-Camel
- glaiveai/glaive-code-assistant-v2
- LDJnr/Pure-Dove
- meta-math/MetaMathQA
- VMware/open-instruct
- TIGER-Lab/MathInstruct
- LDJnr/Capybara
- OpenOrca GPT-4
After the initial concatenation, the dataset will be subjected to a basic cleaning process to exclude any examples containing Reinforcement Learning from Human Feedback (RLHF) refusals.
## Quality Scoring and Selection
A regression model, such as Tiny Llama for sequence classification, will be employed to evaluate each example in the raw Rhino data based on quality. This model will be trained on a curated version of the Nectar dataset.
The scoring of examples in the Nectar dataset is calculated using the following function:
Given a list of answers \( A \) and a randomly chosen answer \( a \in A \), the score \( S \) for the chosen answer \( a \) is calculated as:
\[ S = 1 - \frac{\text{index}(a)}{|A|} \]
where \( \text{index}(a) \) is the position of the randomly chosen answer \( a \) in the list \( A \), and \( |A| \) is the total number of answers in the list.
The sequence classification model will use this scoring function to assess the raw Rhino data and determine which examples to retain or discard based on their assessed quality.
## Licensing Considerations
It is crucial to note that the datasets incorporated into the Rhino dataset may be subject to strict licensing agreements. Prior to using the Rhino dataset, one must thoroughly review and comply with the licensing terms of the constituent datasets.
## Contributions
Thanks to @jantxu, @jtatman, and @Locutusque for their contributions to this dataset.
## Usage
We recommend skipping examples with a quality score that is less than 0.05-0.1 while fine-tuning a language model. | [
"# Rhino Dataset\nThe Rhino dataset is a comprehensive instruction-following dataset designed to train a highly performant language model named RhinoBeetle. This dataset aims to combine both quality and quantity to facilitate robust machine learning applications.",
"## Construction Blueprint\nTo create the Rhino dataset, a collection of diverse datasets will be concatenated to form the initial raw data. These source datasets include:\n\n- LDJnr/Verified-Camel\n- glaiveai/glaive-code-assistant-v2\n- LDJnr/Pure-Dove\n- meta-math/MetaMathQA\n- VMware/open-instruct\n- TIGER-Lab/MathInstruct\n- LDJnr/Capybara\n- OpenOrca GPT-4\n\nAfter the initial concatenation, the dataset will be subjected to a basic cleaning process to exclude any examples containing Reinforcement Learning from Human Feedback (RLHF) refusals.",
"## Quality Scoring and Selection\nA regression model, such as Tiny Llama for sequence classification, will be employed to evaluate each example in the raw Rhino data based on quality. This model will be trained on a curated version of the Nectar dataset.\n\nThe scoring of examples in the Nectar dataset is calculated using the following function:\n\nGiven a list of answers \\( A \\) and a randomly chosen answer \\( a \\in A \\), the score \\( S \\) for the chosen answer \\( a \\) is calculated as:\n\n\\[ S = 1 - \\frac{\\text{index}(a)}{|A|} \\]\n\nwhere \\( \\text{index}(a) \\) is the position of the randomly chosen answer \\( a \\) in the list \\( A \\), and \\( |A| \\) is the total number of answers in the list.\n\nThe sequence classification model will use this scoring function to assess the raw Rhino data and determine which examples to retain or discard based on their assessed quality.",
"## Licensing Considerations\nIt is crucial to note that the datasets incorporated into the Rhino dataset may be subject to strict licensing agreements. Prior to using the Rhino dataset, one must thoroughly review and comply with the licensing terms of the constituent datasets.",
"## Contributions\nThanks to @jantxu, @jtatman, and @Locutusque for their contributions to this dataset.",
"## Usage\nWe recommend skipping examples with a quality score that is less than 0.05-0.1 while fine-tuning a language model."
] | [
"TAGS\n#task_categories-text-generation #task_categories-conversational #task_categories-question-answering #size_categories-1M<n<10M #language-English #license-apache-2.0 #region-us \n",
"# Rhino Dataset\nThe Rhino dataset is a comprehensive instruction-following dataset designed to train a highly performant language model named RhinoBeetle. This dataset aims to combine both quality and quantity to facilitate robust machine learning applications.",
"## Construction Blueprint\nTo create the Rhino dataset, a collection of diverse datasets will be concatenated to form the initial raw data. These source datasets include:\n\n- LDJnr/Verified-Camel\n- glaiveai/glaive-code-assistant-v2\n- LDJnr/Pure-Dove\n- meta-math/MetaMathQA\n- VMware/open-instruct\n- TIGER-Lab/MathInstruct\n- LDJnr/Capybara\n- OpenOrca GPT-4\n\nAfter the initial concatenation, the dataset will be subjected to a basic cleaning process to exclude any examples containing Reinforcement Learning from Human Feedback (RLHF) refusals.",
"## Quality Scoring and Selection\nA regression model, such as Tiny Llama for sequence classification, will be employed to evaluate each example in the raw Rhino data based on quality. This model will be trained on a curated version of the Nectar dataset.\n\nThe scoring of examples in the Nectar dataset is calculated using the following function:\n\nGiven a list of answers \\( A \\) and a randomly chosen answer \\( a \\in A \\), the score \\( S \\) for the chosen answer \\( a \\) is calculated as:\n\n\\[ S = 1 - \\frac{\\text{index}(a)}{|A|} \\]\n\nwhere \\( \\text{index}(a) \\) is the position of the randomly chosen answer \\( a \\) in the list \\( A \\), and \\( |A| \\) is the total number of answers in the list.\n\nThe sequence classification model will use this scoring function to assess the raw Rhino data and determine which examples to retain or discard based on their assessed quality.",
"## Licensing Considerations\nIt is crucial to note that the datasets incorporated into the Rhino dataset may be subject to strict licensing agreements. Prior to using the Rhino dataset, one must thoroughly review and comply with the licensing terms of the constituent datasets.",
"## Contributions\nThanks to @jantxu, @jtatman, and @Locutusque for their contributions to this dataset.",
"## Usage\nWe recommend skipping examples with a quality score that is less than 0.05-0.1 while fine-tuning a language model."
] | [
63,
55,
157,
271,
65,
30,
29
] | [
"passage: TAGS\n#task_categories-text-generation #task_categories-conversational #task_categories-question-answering #size_categories-1M<n<10M #language-English #license-apache-2.0 #region-us \n# Rhino Dataset\nThe Rhino dataset is a comprehensive instruction-following dataset designed to train a highly performant language model named RhinoBeetle. This dataset aims to combine both quality and quantity to facilitate robust machine learning applications.## Construction Blueprint\nTo create the Rhino dataset, a collection of diverse datasets will be concatenated to form the initial raw data. These source datasets include:\n\n- LDJnr/Verified-Camel\n- glaiveai/glaive-code-assistant-v2\n- LDJnr/Pure-Dove\n- meta-math/MetaMathQA\n- VMware/open-instruct\n- TIGER-Lab/MathInstruct\n- LDJnr/Capybara\n- OpenOrca GPT-4\n\nAfter the initial concatenation, the dataset will be subjected to a basic cleaning process to exclude any examples containing Reinforcement Learning from Human Feedback (RLHF) refusals."
] |
08755e97e7919a295f634f231fa424dbef123005 |
# Dataset Card for Evaluation run of SUSTech/SUS-Chat-72B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SUSTech/SUS-Chat-72B](https://huggingface.co/SUSTech/SUS-Chat-72B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SUSTech__SUS-Chat-72B",
"harness_winogrande_5",
split="train")
```
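The aggregated metrics described above are exposed through the "results" configuration and can be loaded the same way; the snippet below uses the "latest" split listed in this dataset's configs.

```python
from datasets import load_dataset

# Aggregated results of the run (the "results" configuration).
results = load_dataset("open-llm-leaderboard/details_SUSTech__SUS-Chat-72B",
                       "results",
                       split="latest")
```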
## Latest results
These are the [latest results from run 2023-12-30T08:38:52.255652](https://huggingface.co/datasets/open-llm-leaderboard/details_SUSTech__SUS-Chat-72B/blob/main/results_2023-12-30T08-38-52.255652.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7531471665521513,
"acc_stderr": 0.028005234629175594,
"acc_norm": 0.7666170688561996,
"acc_norm_stderr": 0.028617434882601496,
"mc1": 0.44063647490820074,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.6026834780213507,
"mc2_stderr": 0.014913414941903928
},
"harness|arc:challenge|25": {
"acc": 0.6373720136518771,
"acc_stderr": 0.014049106564955002,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902274
},
"harness|hellaswag|10": {
"acc": 0.6585341565425215,
"acc_stderr": 0.004732322172153752,
"acc_norm": 0.849631547500498,
"acc_norm_stderr": 0.0035670171422264854
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.725925925925926,
"acc_stderr": 0.038532548365520045,
"acc_norm": 0.725925925925926,
"acc_norm_stderr": 0.038532548365520045
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.868421052631579,
"acc_stderr": 0.027508689533549915,
"acc_norm": 0.868421052631579,
"acc_norm_stderr": 0.027508689533549915
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8301886792452831,
"acc_stderr": 0.02310839379984133,
"acc_norm": 0.8301886792452831,
"acc_norm_stderr": 0.02310839379984133
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7872340425531915,
"acc_stderr": 0.02675439134803977,
"acc_norm": 0.7872340425531915,
"acc_norm_stderr": 0.02675439134803977
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8,
"acc_stderr": 0.03333333333333329,
"acc_norm": 0.8,
"acc_norm_stderr": 0.03333333333333329
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.671957671957672,
"acc_stderr": 0.024180497164376896,
"acc_norm": 0.671957671957672,
"acc_norm_stderr": 0.024180497164376896
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8870967741935484,
"acc_stderr": 0.01800360332586361,
"acc_norm": 0.8870967741935484,
"acc_norm_stderr": 0.01800360332586361
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6798029556650246,
"acc_stderr": 0.032826493853041504,
"acc_norm": 0.6798029556650246,
"acc_norm_stderr": 0.032826493853041504
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066584,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066584
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.01826310542019951,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.01826310542019951
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909046,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909046
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.019457390787681786,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.019457390787681786
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.030343862998512636,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.030343862998512636
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.023005459446673957,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.023005459446673957
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5629139072847682,
"acc_stderr": 0.040500357222306355,
"acc_norm": 0.5629139072847682,
"acc_norm_stderr": 0.040500357222306355
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9247706422018349,
"acc_stderr": 0.011308662537571746,
"acc_norm": 0.9247706422018349,
"acc_norm_stderr": 0.011308662537571746
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.032568505702936464,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.032568505702936464
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640266,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640266
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407256,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407256
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597446,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597446
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622814,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622814
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.031457038543062504,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.031457038543062504
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.852760736196319,
"acc_stderr": 0.027839915278339653,
"acc_norm": 0.852760736196319,
"acc_norm_stderr": 0.027839915278339653
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.04572372358737431,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.04572372358737431
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446908,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446908
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9233716475095786,
"acc_stderr": 0.00951217069932386,
"acc_norm": 0.9233716475095786,
"acc_norm_stderr": 0.00951217069932386
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8323699421965318,
"acc_stderr": 0.020110579919734847,
"acc_norm": 0.8323699421965318,
"acc_norm_stderr": 0.020110579919734847
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6022346368715084,
"acc_stderr": 0.01636920497126299,
"acc_norm": 0.6022346368715084,
"acc_norm_stderr": 0.01636920497126299
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.019704039183859812,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.019704039183859812
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8263665594855305,
"acc_stderr": 0.021514051585970393,
"acc_norm": 0.8263665594855305,
"acc_norm_stderr": 0.021514051585970393
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.018689725721062065,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.018689725721062065
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6524822695035462,
"acc_stderr": 0.02840662780959095,
"acc_norm": 0.6524822695035462,
"acc_norm_stderr": 0.02840662780959095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6121251629726207,
"acc_stderr": 0.012444998309675633,
"acc_norm": 0.6121251629726207,
"acc_norm_stderr": 0.012444998309675633
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8345588235294118,
"acc_stderr": 0.022571771025494743,
"acc_norm": 0.8345588235294118,
"acc_norm_stderr": 0.022571771025494743
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8120915032679739,
"acc_stderr": 0.01580356573677669,
"acc_norm": 0.8120915032679739,
"acc_norm_stderr": 0.01580356573677669
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7909090909090909,
"acc_stderr": 0.03895091015724136,
"acc_norm": 0.7909090909090909,
"acc_norm_stderr": 0.03895091015724136
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.0250002560395462,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.0250002560395462
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.0211662163046594,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.0211662163046594
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.02190429135575904,
"acc_norm": 0.95,
"acc_norm_stderr": 0.02190429135575904
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44063647490820074,
"mc1_stderr": 0.017379697555437446,
"mc2": 0.6026834780213507,
"mc2_stderr": 0.014913414941903928
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370637
},
"harness|gsm8k|5": {
"acc": 0.09401061410159212,
"acc_stderr": 0.008038819818872465
}
}
```
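As an example of working with these numbers, the sketch below averages the per-subtask accuracies of the hendrycksTest (MMLU) entries; it assumes the JSON block above has been saved to a local results.json file.

```python
import json

# Assumes the results dictionary shown above was saved to results.json.
with open("results.json") as f:
    results = json.load(f)

# Mean accuracy over the hendrycksTest (MMLU) subtasks.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```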
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_SUSTech__SUS-Chat-72B | [
"region:us"
] | 2023-12-30T08:40:59+00:00 | {"pretty_name": "Evaluation run of SUSTech/SUS-Chat-72B", "dataset_summary": "Dataset automatically created during the evaluation run of model [SUSTech/SUS-Chat-72B](https://huggingface.co/SUSTech/SUS-Chat-72B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SUSTech__SUS-Chat-72B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T08:38:52.255652](https://huggingface.co/datasets/open-llm-leaderboard/details_SUSTech__SUS-Chat-72B/blob/main/results_2023-12-30T08-38-52.255652.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7531471665521513,\n \"acc_stderr\": 0.028005234629175594,\n \"acc_norm\": 0.7666170688561996,\n \"acc_norm_stderr\": 0.028617434882601496,\n \"mc1\": 0.44063647490820074,\n \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6026834780213507,\n \"mc2_stderr\": 0.014913414941903928\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6373720136518771,\n \"acc_stderr\": 0.014049106564955002,\n \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902274\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6585341565425215,\n \"acc_stderr\": 0.004732322172153752,\n \"acc_norm\": 0.849631547500498,\n \"acc_norm_stderr\": 0.0035670171422264854\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.725925925925926,\n \"acc_stderr\": 0.038532548365520045,\n \"acc_norm\": 0.725925925925926,\n \"acc_norm_stderr\": 0.038532548365520045\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.868421052631579,\n \"acc_stderr\": 0.027508689533549915,\n \"acc_norm\": 0.868421052631579,\n \"acc_norm_stderr\": 0.027508689533549915\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8301886792452831,\n \"acc_stderr\": 0.02310839379984133,\n \"acc_norm\": 0.8301886792452831,\n \"acc_norm_stderr\": 0.02310839379984133\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 
0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7872340425531915,\n \"acc_stderr\": 0.02675439134803977,\n \"acc_norm\": 0.7872340425531915,\n \"acc_norm_stderr\": 0.02675439134803977\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.03333333333333329,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.03333333333333329\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.671957671957672,\n \"acc_stderr\": 0.024180497164376896,\n \"acc_norm\": 0.671957671957672,\n \"acc_norm_stderr\": 0.024180497164376896\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8870967741935484,\n \"acc_stderr\": 0.01800360332586361,\n \"acc_norm\": 0.8870967741935484,\n \"acc_norm_stderr\": 0.01800360332586361\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6798029556650246,\n \"acc_stderr\": 0.032826493853041504,\n \"acc_norm\": 0.6798029556650246,\n \"acc_norm_stderr\": 0.032826493853041504\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066584,\n \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066584\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.01826310542019951,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.01826310542019951\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909046,\n \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909046\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8205128205128205,\n \"acc_stderr\": 
0.019457390787681786,\n \"acc_norm\": 0.8205128205128205,\n \"acc_norm_stderr\": 0.019457390787681786\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.45185185185185184,\n \"acc_stderr\": 0.030343862998512636,\n \"acc_norm\": 0.45185185185185184,\n \"acc_norm_stderr\": 0.030343862998512636\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.023005459446673957,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.023005459446673957\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5629139072847682,\n \"acc_stderr\": 0.040500357222306355,\n \"acc_norm\": 0.5629139072847682,\n \"acc_norm_stderr\": 0.040500357222306355\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9247706422018349,\n \"acc_stderr\": 0.011308662537571746,\n \"acc_norm\": 0.9247706422018349,\n \"acc_norm_stderr\": 0.011308662537571746\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.032568505702936464,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.032568505702936464\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640266,\n \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640266\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n \"acc_stderr\": 0.026241132996407256,\n \"acc_norm\": 0.8116591928251121,\n \"acc_norm_stderr\": 0.026241132996407256\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597446,\n \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597446\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622814,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622814\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.031457038543062504,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.031457038543062504\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.852760736196319,\n \"acc_stderr\": 0.027839915278339653,\n \"acc_norm\": 0.852760736196319,\n \"acc_norm_stderr\": 0.027839915278339653\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.6339285714285714,\n \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n \"acc_stderr\": 0.015006312806446908,\n \"acc_norm\": 0.9444444444444444,\n \"acc_norm_stderr\": 0.015006312806446908\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9233716475095786,\n \"acc_stderr\": 0.00951217069932386,\n \"acc_norm\": 0.9233716475095786,\n 
\"acc_norm_stderr\": 0.00951217069932386\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8323699421965318,\n \"acc_stderr\": 0.020110579919734847,\n \"acc_norm\": 0.8323699421965318,\n \"acc_norm_stderr\": 0.020110579919734847\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6022346368715084,\n \"acc_stderr\": 0.01636920497126299,\n \"acc_norm\": 0.6022346368715084,\n \"acc_norm_stderr\": 0.01636920497126299\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.019704039183859812,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.019704039183859812\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8263665594855305,\n \"acc_stderr\": 0.021514051585970393,\n \"acc_norm\": 0.8263665594855305,\n \"acc_norm_stderr\": 0.021514051585970393\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8703703703703703,\n \"acc_stderr\": 0.018689725721062065,\n \"acc_norm\": 0.8703703703703703,\n \"acc_norm_stderr\": 0.018689725721062065\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6524822695035462,\n \"acc_stderr\": 0.02840662780959095,\n \"acc_norm\": 0.6524822695035462,\n \"acc_norm_stderr\": 0.02840662780959095\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6121251629726207,\n \"acc_stderr\": 0.012444998309675633,\n \"acc_norm\": 0.6121251629726207,\n \"acc_norm_stderr\": 0.012444998309675633\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8345588235294118,\n \"acc_stderr\": 0.022571771025494743,\n \"acc_norm\": 0.8345588235294118,\n \"acc_norm_stderr\": 0.022571771025494743\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8120915032679739,\n \"acc_stderr\": 0.01580356573677669,\n \"acc_norm\": 0.8120915032679739,\n \"acc_norm_stderr\": 0.01580356573677669\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7909090909090909,\n \"acc_stderr\": 0.03895091015724136,\n \"acc_norm\": 0.7909090909090909,\n \"acc_norm_stderr\": 0.03895091015724136\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.0250002560395462,\n \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.0250002560395462\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.0211662163046594,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.0211662163046594\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.95,\n \"acc_stderr\": 0.02190429135575904,\n \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.02190429135575904\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44063647490820074,\n \"mc1_stderr\": 0.017379697555437446,\n \"mc2\": 0.6026834780213507,\n \"mc2_stderr\": 0.014913414941903928\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370637\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09401061410159212,\n \"acc_stderr\": 0.008038819818872465\n }\n}\n```", "repo_url": "https://huggingface.co/SUSTech/SUS-Chat-72B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|arc:challenge|25_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|gsm8k|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hellaswag|10_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T08-38-52.255652.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T08-38-52.255652.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T08-38-52.255652.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T08-38-52.255652.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T08-38-52.255652.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T08-38-52.255652.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["**/details_harness|winogrande|5_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T08-38-52.255652.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T08_38_52.255652", "path": ["results_2023-12-30T08-38-52.255652.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T08-38-52.255652.parquet"]}]}]} | 2023-12-30T08:41:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of SUSTech/SUS-Chat-72B
Dataset automatically created during the evaluation run of model SUSTech/SUS-Chat-72B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
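The loading snippet itself was stripped from this cleaned copy of the card; based on the pattern the other leaderboard cards in this dump use, it would presumably look like the sketch below (the repository name `details_SUSTech__SUS-Chat-72B` is an assumption following the usual `details_<org>__<model>` convention):

```python
from datasets import load_dataset

# Assumed repository name, following the details_<org>__<model> convention
data = load_dataset("open-llm-leaderboard/details_SUSTech__SUS-Chat-72B",
	"harness_winogrande_5",
	split="train")
```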
## Latest results
These are the latest results from run 2023-12-30T08:38:52.255652 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of SUSTech/SUS-Chat-72B\n\n\n\nDataset automatically created during the evaluation run of model SUSTech/SUS-Chat-72B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T08:38:52.255652(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of SUSTech/SUS-Chat-72B\n\n\n\nDataset automatically created during the evaluation run of model SUSTech/SUS-Chat-72B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T08:38:52.255652(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of SUSTech/SUS-Chat-72B\n\n\n\nDataset automatically created during the evaluation run of model SUSTech/SUS-Chat-72B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T08:38:52.255652(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
6f758e7e0b19bcecbced05d3558df5ac7eedd15b | # Dataset Card for "weibo_senti_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | jay401521/weibo_senti_train | [
"region:us"
] | 2023-12-30T09:13:16+00:00 | {"dataset_info": {"features": [{"name": "label", "dtype": "int64"}, {"name": "review", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17338582, "num_examples": 99988}], "download_size": 13189738, "dataset_size": 17338582}} | 2023-12-30T09:13:21+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "weibo_senti_train"
More Information needed | [
"# Dataset Card for \"weibo_senti_train\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"weibo_senti_train\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"weibo_senti_train\"\n\nMore Information needed"
] |
e3d7bec9f2d894a77946d8b3d113d8c2b1718651 | # Dataset Card for "weibo_senti_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | jay401521/weibo_senti_test | [
"region:us"
] | 2023-12-30T09:14:08+00:00 | {"dataset_info": {"features": [{"name": "label", "dtype": "int64"}, {"name": "review", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3433361, "num_examples": 20000}], "download_size": 2608855, "dataset_size": 3433361}} | 2023-12-30T09:14:13+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "weibo_senti_test"
More Information needed | [
"# Dataset Card for \"weibo_senti_test\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"weibo_senti_test\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"weibo_senti_test\"\n\nMore Information needed"
] |
f51b050b72f6ce2afc96e19049a815dbf641c1db | # Dataset Card for "gpt2-124M-qlora-chat-support"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ASDFD23/gpt2-124M-qlora-chat-support | [
"region:us"
] | 2023-12-30T09:59:27+00:00 | {"dataset_info": {"features": [{"name": "answer", "dtype": "string"}, {"name": "question", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17924, "num_examples": 79}], "download_size": 9896, "dataset_size": 17924}} | 2023-12-30T09:59:29+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "gpt2-124M-qlora-chat-support"
More Information needed | [
"# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] |
6e6c325447fff5a9555eef340f69e4bb70618978 | # Dataset Card for "gpt2-124M-qlora-chat-support"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Mudassir135/gpt2-124M-qlora-chat-support | [
"region:us"
] | 2023-12-30T09:59:42+00:00 | {"dataset_info": {"features": [{"name": "answer", "dtype": "string"}, {"name": "question", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17924, "num_examples": 79}], "download_size": 9896, "dataset_size": 17924}} | 2023-12-30T09:59:43+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "gpt2-124M-qlora-chat-support"
More Information needed | [
"# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] |
6f5abbd896e8d714f03e9871545af233680f3839 | # Dataset Card for "gpt2-124M-qlora-chat-support"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mwasif86/gpt2-124M-qlora-chat-support | [
"region:us"
] | 2023-12-30T10:00:27+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17924, "num_examples": 79}], "download_size": 9894, "dataset_size": 17924}} | 2023-12-30T10:00:29+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "gpt2-124M-qlora-chat-support"
More Information needed | [
"# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] |
eee615d7ba33d21b466acc992afaedf750a6f417 | # Dataset Card for "gpt2-124M-qlora-chat-support"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | atifss/gpt2-124M-qlora-chat-support | [
"region:us"
] | 2023-12-30T10:00:45+00:00 | {"dataset_info": {"features": [{"name": "answer", "dtype": "string"}, {"name": "question", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17924, "num_examples": 79}], "download_size": 9896, "dataset_size": 17924}} | 2023-12-30T10:00:48+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "gpt2-124M-qlora-chat-support"
More Information needed | [
"# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] |
7a236db2b88b4f92e24a7e7cec660e5aac59e1c1 | # Dataset Card for "gpt2-124M-qlora-chat-support"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mouaaz-ranjha/gpt2-124M-qlora-chat-support | [
"region:us"
] | 2023-12-30T10:00:49+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17924, "num_examples": 79}], "download_size": 9894, "dataset_size": 17924}} | 2023-12-30T10:04:03+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "gpt2-124M-qlora-chat-support"
More Information needed | [
"# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] |
05f2801a3ae5b696f1a7ab661594b126c8e6b009 | # Dataset Card for "gpt2-124M-qlora-chat-support"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Imran263/gpt2-124M-qlora-chat-support | [
"region:us"
] | 2023-12-30T10:00:49+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17924, "num_examples": 79}], "download_size": 9894, "dataset_size": 17924}} | 2023-12-30T10:00:53+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "gpt2-124M-qlora-chat-support"
More Information needed | [
"# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] |
e9cb0d94d468f35b3f1425bed7b58150abf919e5 | # Dataset Card for "NewGPT2Dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | meerlubna/NewGPT2Dataset | [
"region:us"
] | 2023-12-30T10:00:53+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17924, "num_examples": 79}], "download_size": 9894, "dataset_size": 17924}} | 2023-12-30T10:19:59+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "NewGPT2Dataset"
More Information needed | [
"# Dataset Card for \"NewGPT2Dataset\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"NewGPT2Dataset\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"NewGPT2Dataset\"\n\nMore Information needed"
] |
f4569b5f20fc22f42acf30bafe6a36412d18a215 | # Dataset Card for "gpt2-124M-qlora-chat-support"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mmajbaig/gpt2-124M-qlora-chat-support | [
"region:us"
] | 2023-12-30T10:01:06+00:00 | {"dataset_info": {"features": [{"name": "answer", "dtype": "string"}, {"name": "question", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17924, "num_examples": 79}], "download_size": 9896, "dataset_size": 17924}} | 2023-12-30T10:01:07+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "gpt2-124M-qlora-chat-support"
More Information needed | [
"# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"gpt2-124M-qlora-chat-support\"\n\nMore Information needed"
] |
3c133aad4d72b754f671843dea7501187a7fb797 |
## Dataset Description
- **Website:** https://openaiwatch.com
- **License:** MIT
- **Language(s) (NLP):** English
### Dataset Summary
The OpenAIWatch dataset is designed to monitor the performance of OpenAI language models, including GPT-3.5-Turbo, GPT-4, and GPT-4-Turbo, over time. This dataset is generated by prompting these models hourly with the phrase "Draw a unicorn in TikZ:" and using greedy decoding (temperature 0). This approach aims to assess the consistency of model responses, and the dataset documents variations in these responses. The target is four requests per model per hour, though actual data may vary due to intermittent request failures.
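As a rough illustration of the collection setup described above (a sketch only, not the actual OpenAIWatch pipeline), a single greedy-decoded request could look like this; the client setup and function name are assumptions, while the prompt, model identifiers, and temperature come from the summary and the `model` field description:

```python
from openai import OpenAI  # assumes the v1 Python client; the real pipeline may differ

client = OpenAI()  # reads OPENAI_API_KEY from the environment
PROMPT = "Draw a unicorn in TikZ:"
MODELS = ["gpt-3.5-turbo", "gpt-4"]  # identifiers as given in the `model` field description

def query_once(model: str) -> str:
    # Greedy decoding: temperature 0, as described in the dataset summary.
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": PROMPT}],
        temperature=0,
    )
    return response.choices[0].message.content
```

In the real dataset such requests are issued hourly, with a target of four per model, and intermittent failures account for missing rows.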
### Data Fields
- `timestamp` (timestamp): The UTC timestamp of each request.
- `model` (string): The model used for each request, such as gpt-3.5-turbo or gpt-4.
- `raw_response` (string): The direct response from the OpenAI API.
- `tikz_code` (string|None): The extracted TikZ code, identified using the regex pattern \\begin{tikzpicture}.*\\end{tikzpicture}, or None if no match is found.
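As a sketch of how the `tikz_code` field could be derived from `raw_response` with the pattern quoted above (the `re.DOTALL` flag is an assumption so the match can span multiple lines; the actual extraction code may differ):

```python
import re

# Pattern as described for the tikz_code field; DOTALL is assumed so it spans newlines.
TIKZ_PATTERN = re.compile(r"\\begin\{tikzpicture\}.*\\end\{tikzpicture\}", re.DOTALL)

def extract_tikz(raw_response: str):
    # Return the matched TikZ block, or None if no match is found,
    # mirroring the semantics of the tikz_code field.
    match = TIKZ_PATTERN.search(raw_response)
    return match.group(0) if match else None
```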
### Example Findings
For specific insights derived from the dataset, refer to this Twitter post: https://twitter.com/yuntiandeng/status/1682066606044635136. The tweet discusses observable trends post the June update of GPT-4, showing a noticeable shift in the quality of 'unicorn drawings' generated by GPT-3.5 and GPT-4. It compares the performance of these models before and after the update, suggesting an improvement in GPT-3.5's outputs while noting a decline in GPT-4's performance at this specific task. | yuntian-deng/openaiwatch | [
"license:mit",
"region:us"
] | 2023-12-30T10:05:19+00:00 | {"license": "mit", "dataset_info": {"features": [{"name": "timestamp", "dtype": "timestamp[s, tz=UTC]"}, {"name": "model", "dtype": "string"}, {"name": "raw_response", "dtype": "string"}, {"name": "tikz_code", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 410038774.0, "num_examples": 74789}], "download_size": 8275038, "dataset_size": 410038774.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-30T17:10:26+00:00 | [] | [] | TAGS
#license-mit #region-us
|
## Dataset Description
- Website: URL
- License: MIT
- Language(s) (NLP): English
### Dataset Summary
The OpenAIWatch dataset is designed to monitor the performance of OpenAI language models, including GPT-3.5-Turbo, GPT-4, and GPT-4-Turbo, over time. This dataset is generated by prompting these models hourly with the phrase "Draw a unicorn in TikZ:" and using greedy decoding (temperature 0). This approach aims to assess the consistency of model responses, and the dataset documents variations in these responses. The target is four requests per model per hour, though actual data may vary due to intermittent request failures.
### Data Fields
- 'timestamp' (timestamp): The UTC timestamp of each request.
- 'model' (string): The model used for each request, such as gpt-3.5-turbo or gpt-4.
- 'raw_response' (string): The direct response from the OpenAI API.
- 'tikz_code' (string|None): The extracted TikZ code, identified using the regex pattern \\begin{tikzpicture}.*\\end{tikzpicture}, or None if no match is found.
### Example Findings
For specific insights derived from the dataset, refer to this Twitter post: URL The tweet discusses observable trends post the June update of GPT-4, showing a noticeable shift in the quality of 'unicorn drawings' generated by GPT-3.5 and GPT-4. It compares the performance of these models before and after the update, suggesting an improvement in GPT-3.5's outputs while noting a decline in GPT-4's performance at this specific task. | [
"## Dataset Description\n \n- Website: URL\n\n- License: MIT\n\n- Language(s) (NLP): English",
"### Dataset Summary\n\nThe OpenAIWatch dataset is designed to monitor the performance of OpenAI language models, including GPT-3.5-Turbo, GPT-4, and GPT-4-Turbo, over time. This dataset is generated by prompting these models hourly with the phrase \"Draw a unicorn in TikZ:\" and using greedy decoding (temperature 0). This approach aims to assess the consistency of model responses, and the dataset documents variations in these responses. The target is four requests per model per hour, though actual data may vary due to intermittent request failures.",
"### Data Fields\n\n- 'timestamp' (timestamp): The UTC timestamp of each request.\n- 'model' (string): The model used for each request, such as gpt-3.5-turbo or gpt-4.\n- 'raw_response' (string): The direct response from the OpenAI API.\n- 'tikz_code' (string|None): The extracted TikZ code, identified using the regex pattern \\\\begin{tikzpicture}.*\\\\end{tikzpicture}, or None if no match is found.",
"### Example Findings\n\nFor specific insights derived from the dataset, refer to this Twitter post: URL The tweet discusses observable trends post the June update of GPT-4, showing a noticeable shift in the quality of 'unicorn drawings' generated by GPT-3.5 and GPT-4. It compares the performance of these models before and after the update, suggesting an improvement in GPT-3.5's outputs while noting a decline in GPT-4's performance at this specific task."
] | [
"TAGS\n#license-mit #region-us \n",
"## Dataset Description\n \n- Website: URL\n\n- License: MIT\n\n- Language(s) (NLP): English",
"### Dataset Summary\n\nThe OpenAIWatch dataset is designed to monitor the performance of OpenAI language models, including GPT-3.5-Turbo, GPT-4, and GPT-4-Turbo, over time. This dataset is generated by prompting these models hourly with the phrase \"Draw a unicorn in TikZ:\" and using greedy decoding (temperature 0). This approach aims to assess the consistency of model responses, and the dataset documents variations in these responses. The target is four requests per model per hour, though actual data may vary due to intermittent request failures.",
"### Data Fields\n\n- 'timestamp' (timestamp): The UTC timestamp of each request.\n- 'model' (string): The model used for each request, such as gpt-3.5-turbo or gpt-4.\n- 'raw_response' (string): The direct response from the OpenAI API.\n- 'tikz_code' (string|None): The extracted TikZ code, identified using the regex pattern \\\\begin{tikzpicture}.*\\\\end{tikzpicture}, or None if no match is found.",
"### Example Findings\n\nFor specific insights derived from the dataset, refer to this Twitter post: URL The tweet discusses observable trends post the June update of GPT-4, showing a noticeable shift in the quality of 'unicorn drawings' generated by GPT-3.5 and GPT-4. It compares the performance of these models before and after the update, suggesting an improvement in GPT-3.5's outputs while noting a decline in GPT-4's performance at this specific task."
] | [
11,
22,
140,
127,
113
] | [
"passage: TAGS\n#license-mit #region-us \n## Dataset Description\n \n- Website: URL\n\n- License: MIT\n\n- Language(s) (NLP): English### Dataset Summary\n\nThe OpenAIWatch dataset is designed to monitor the performance of OpenAI language models, including GPT-3.5-Turbo, GPT-4, and GPT-4-Turbo, over time. This dataset is generated by prompting these models hourly with the phrase \"Draw a unicorn in TikZ:\" and using greedy decoding (temperature 0). This approach aims to assess the consistency of model responses, and the dataset documents variations in these responses. The target is four requests per model per hour, though actual data may vary due to intermittent request failures.### Data Fields\n\n- 'timestamp' (timestamp): The UTC timestamp of each request.\n- 'model' (string): The model used for each request, such as gpt-3.5-turbo or gpt-4.\n- 'raw_response' (string): The direct response from the OpenAI API.\n- 'tikz_code' (string|None): The extracted TikZ code, identified using the regex pattern \\\\begin{tikzpicture}.*\\\\end{tikzpicture}, or None if no match is found.### Example Findings\n\nFor specific insights derived from the dataset, refer to this Twitter post: URL The tweet discusses observable trends post the June update of GPT-4, showing a noticeable shift in the quality of 'unicorn drawings' generated by GPT-3.5 and GPT-4. It compares the performance of these models before and after the update, suggesting an improvement in GPT-3.5's outputs while noting a decline in GPT-4's performance at this specific task."
] |
bb9bb9058a0a95312fad0fc2b6a239417daf91b2 |
# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kekmodel/StopCarbon-10.7B-v1](https://huggingface.co/kekmodel/StopCarbon-10.7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v1",
"harness_winogrande_5",
split="train")
```
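The aggregated metrics mentioned above live in the "results" configuration; loading them would presumably follow the same pattern (the "latest" split name is taken from the config layout used by these leaderboard datasets and is assumed to apply here as well):

```python
from datasets import load_dataset

# Aggregated results of the most recent run (config and split names assumed from the layout)
results = load_dataset("open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v1",
	"results",
	split="latest")
```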
## Latest results
These are the [latest results from run 2023-12-30T10:11:02.213915](https://huggingface.co/datasets/open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v1/blob/main/results_2023-12-30T10-11-02.213915.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.666816860137451,
"acc_stderr": 0.031599000003369335,
"acc_norm": 0.6677161238426604,
"acc_norm_stderr": 0.03224247655400503,
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7170820046876212,
"mc2_stderr": 0.014999711441421053
},
"harness|arc:challenge|25": {
"acc": 0.6825938566552902,
"acc_stderr": 0.013602239088038167,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.013273077865907595
},
"harness|hellaswag|10": {
"acc": 0.7123083051185023,
"acc_stderr": 0.004517614647703243,
"acc_norm": 0.8840868352917746,
"acc_norm_stderr": 0.003194665266078602
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.02574806587167328,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.02574806587167328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172534,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172534
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.02921354941437217,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.02921354941437217
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025046,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657569,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657569
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39553072625698327,
"acc_stderr": 0.01635341541007577,
"acc_norm": 0.39553072625698327,
"acc_norm_stderr": 0.01635341541007577
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7363344051446945,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.7363344051446945,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262196,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262196
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4921773142112125,
"acc_stderr": 0.0127686730761119,
"acc_norm": 0.4921773142112125,
"acc_norm_stderr": 0.0127686730761119
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.02655651947004151,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.02655651947004151
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.01882421951270621,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.01882421951270621
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338733,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338733
},
"harness|truthfulqa:mc|0": {
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7170820046876212,
"mc2_stderr": 0.014999711441421053
},
"harness|winogrande|5": {
"acc": 0.8374112075769534,
"acc_stderr": 0.010370455551343331
},
"harness|gsm8k|5": {
"acc": 0.6413949962092494,
"acc_stderr": 0.013210317364134033
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v1 | [
"region:us"
] | 2023-12-30T10:13:18+00:00 | {"pretty_name": "Evaluation run of kekmodel/StopCarbon-10.7B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [kekmodel/StopCarbon-10.7B-v1](https://huggingface.co/kekmodel/StopCarbon-10.7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T10:11:02.213915](https://huggingface.co/datasets/open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v1/blob/main/results_2023-12-30T10-11-02.213915.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.666816860137451,\n \"acc_stderr\": 0.031599000003369335,\n \"acc_norm\": 0.6677161238426604,\n \"acc_norm_stderr\": 0.03224247655400503,\n \"mc1\": 0.572827417380661,\n \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7170820046876212,\n \"mc2_stderr\": 0.014999711441421053\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6825938566552902,\n \"acc_stderr\": 0.013602239088038167,\n \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907595\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7123083051185023,\n \"acc_stderr\": 0.004517614647703243,\n \"acc_norm\": 0.8840868352917746,\n \"acc_norm_stderr\": 0.003194665266078602\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.02574806587167328,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.02574806587167328\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n \"acc_stderr\": 0.021886178567172534,\n \"acc_norm\": 0.8193548387096774,\n \"acc_norm_stderr\": 0.021886178567172534\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.02921354941437217,\n \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.02921354941437217\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025046,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025046\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8058748403575989,\n \"acc_stderr\": 0.014143970276657569,\n \"acc_norm\": 0.8058748403575989,\n \"acc_norm_stderr\": 0.014143970276657569\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39553072625698327,\n \"acc_stderr\": 0.01635341541007577,\n \"acc_norm\": 0.39553072625698327,\n \"acc_norm_stderr\": 0.01635341541007577\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262196,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262196\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4921773142112125,\n \"acc_stderr\": 0.0127686730761119,\n \"acc_norm\": 0.4921773142112125,\n \"acc_norm_stderr\": 0.0127686730761119\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.02655651947004151,\n \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.02655651947004151\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.01882421951270621,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.01882421951270621\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338733,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338733\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.572827417380661,\n \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7170820046876212,\n \"mc2_stderr\": 0.014999711441421053\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.010370455551343331\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6413949962092494,\n \"acc_stderr\": 0.013210317364134033\n 
}\n}\n```", "repo_url": "https://huggingface.co/kekmodel/StopCarbon-10.7B-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|arc:challenge|25_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|gsm8k|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hellaswag|10_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-11-02.213915.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-11-02.213915.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-11-02.213915.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T10-11-02.213915.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-11-02.213915.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T10_11_02.213915", "path": ["**/details_harness|winogrande|5_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T10-11-02.213915.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_30T10_11_02.213915", "path": ["results_2023-12-30T10-11-02.213915.parquet"]}, {"split": "latest", "path": ["results_2023-12-30T10-11-02.213915.parquet"]}]}]} | 2023-12-30T10:13:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v1
Dataset automatically created during the evaluation run of model kekmodel/StopCarbon-10.7B-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-30T10:11:02.213915 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v1\n\n\n\nDataset automatically created during the evaluation run of model kekmodel/StopCarbon-10.7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T10:11:02.213915(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v1\n\n\n\nDataset automatically created during the evaluation run of model kekmodel/StopCarbon-10.7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T10:11:02.213915(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v1\n\n\n\nDataset automatically created during the evaluation run of model kekmodel/StopCarbon-10.7B-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T10:11:02.213915(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
ef080fcad750a70ef9be5274b77772bc00e8285c |
# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kekmodel/StopCarbon-10.7B-v2](https://huggingface.co/kekmodel/StopCarbon-10.7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v2",
"harness_winogrande_5",
split="train")
```
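
The aggregated metrics live in the "results" configuration mentioned above; a minimal sketch of loading them with the same `datasets` API (the "latest" split name follows the convention noted in the Latest results section below and should be treated as an assumption if the repository layout changes):

```python
from datasets import load_dataset, get_dataset_split_names

# Aggregated metrics for this evaluation run; the "latest" split is expected to
# point to the most recent results file stored in this details repository.
results = load_dataset(
    "open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v2",
    "results",
    split="latest",
)
print(results[0])

# Individual runs are also exposed as timestamped splits; list them to pick one.
print(get_dataset_split_names(
    "open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v2", "results"
))
```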
## Latest results
These are the [latest results from run 2023-12-30T10:14:30.385409](https://huggingface.co/datasets/open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v2/blob/main/results_2023-12-30T10-14-30.385409.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6660722389824969,
"acc_stderr": 0.03164239371492794,
"acc_norm": 0.6669225604349007,
"acc_norm_stderr": 0.0322865625835786,
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.720073875364873,
"mc2_stderr": 0.014935001382815729
},
"harness|arc:challenge|25": {
"acc": 0.6877133105802048,
"acc_stderr": 0.013542598541688067,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393441
},
"harness|hellaswag|10": {
"acc": 0.7175861382194781,
"acc_stderr": 0.004492535748097628,
"acc_norm": 0.8859788886675961,
"acc_norm_stderr": 0.0031718733502514836
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4973544973544973,
"acc_stderr": 0.02575094967813039,
"acc_norm": 0.4973544973544973,
"acc_norm_stderr": 0.02575094967813039
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.022037217340267822,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.022037217340267822
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669235,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.03381200005643527,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.03381200005643527
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025046,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776678,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776678
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7363344051446945,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.7363344051446945,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7746913580246914,
"acc_stderr": 0.023246202647819753,
"acc_norm": 0.7746913580246914,
"acc_norm_stderr": 0.023246202647819753
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4908735332464146,
"acc_stderr": 0.01276810860164001,
"acc_norm": 0.4908735332464146,
"acc_norm_stderr": 0.01276810860164001
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7463235294117647,
"acc_stderr": 0.026431329870789527,
"acc_norm": 0.7463235294117647,
"acc_norm_stderr": 0.026431329870789527
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.720073875364873,
"mc2_stderr": 0.014935001382815729
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.010430917468237428
},
"harness|gsm8k|5": {
"acc": 0.6383623957543594,
"acc_stderr": 0.013234658351088766
}
}
```
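For convenience, the aggregated metrics shown above can also be retrieved programmatically from the `results` configuration of this dataset (the `latest` split always points to the most recent run). The snippet below is a minimal sketch using the `datasets` library; the repository id, configuration name, and split name are taken from this card, while the field access at the end is only illustrative since the exact record schema may differ.

```python
from datasets import load_dataset

# Aggregated results for the latest evaluation run of kekmodel/StopCarbon-10.7B-v2.
# The "latest" split is an alias for the most recent timestamped run.
results = load_dataset(
    "open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v2",
    "results",
    split="latest",
)

# Inspect the first record, which mirrors the aggregated JSON shown above.
print(results[0])
```

Per-task details (for example the `harness_gsm8k_5` configuration) can be loaded the same way by swapping the configuration name.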
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v2 | [
"region:us"
] | 2023-12-30T10:16:49+00:00 | {"pretty_name": "Evaluation run of kekmodel/StopCarbon-10.7B-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [kekmodel/StopCarbon-10.7B-v2](https://huggingface.co/kekmodel/StopCarbon-10.7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T10:14:30.385409](https://huggingface.co/datasets/open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v2/blob/main/results_2023-12-30T10-14-30.385409.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6660722389824969,\n \"acc_stderr\": 0.03164239371492794,\n \"acc_norm\": 0.6669225604349007,\n \"acc_norm_stderr\": 0.0322865625835786,\n \"mc1\": 0.5740514075887393,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.720073875364873,\n \"mc2_stderr\": 0.014935001382815729\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6877133105802048,\n \"acc_stderr\": 0.013542598541688067,\n \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393441\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7175861382194781,\n \"acc_stderr\": 0.004492535748097628,\n \"acc_norm\": 0.8859788886675961,\n \"acc_norm_stderr\": 0.0031718733502514836\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n 
\"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4973544973544973,\n \"acc_stderr\": 0.02575094967813039,\n \"acc_norm\": 0.4973544973544973,\n \"acc_norm_stderr\": 0.02575094967813039\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n \"acc_stderr\": 0.022037217340267822,\n \"acc_norm\": 0.8161290322580645,\n \"acc_norm_stderr\": 0.022037217340267822\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669235,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669235\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.03381200005643527,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.03381200005643527\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025046,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025046\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776678,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776678\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 
0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7746913580246914,\n \"acc_stderr\": 0.023246202647819753,\n \"acc_norm\": 0.7746913580246914,\n \"acc_norm_stderr\": 0.023246202647819753\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4908735332464146,\n \"acc_stderr\": 0.01276810860164001,\n \"acc_norm\": 0.4908735332464146,\n \"acc_norm_stderr\": 0.01276810860164001\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789527,\n \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789527\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5740514075887393,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.720073875364873,\n \"mc2_stderr\": 0.014935001382815729\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237428\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6383623957543594,\n \"acc_stderr\": 0.013234658351088766\n }\n}\n```", "repo_url": "https://huggingface.co/kekmodel/StopCarbon-10.7B-v2", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|arc:challenge|25_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|gsm8k|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hellaswag|10_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-14-30.385409.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-14-30.385409.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-14-30.385409.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T10-14-30.385409.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-14-30.385409.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-14-30.385409.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["**/details_harness|winogrande|5_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T10-14-30.385409.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T10_14_30.385409", "path": ["results_2023-12-30T10-14-30.385409.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T10-14-30.385409.parquet"]}]}]} | 2023-12-30T10:17:12+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v2
Dataset automatically created during the evaluation run of model kekmodel/StopCarbon-10.7B-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-30T10:14:30.385409(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v2\n\n\n\nDataset automatically created during the evaluation run of model kekmodel/StopCarbon-10.7B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T10:14:30.385409(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v2\n\n\n\nDataset automatically created during the evaluation run of model kekmodel/StopCarbon-10.7B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T10:14:30.385409(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v2\n\n\n\nDataset automatically created during the evaluation run of model kekmodel/StopCarbon-10.7B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T10:14:30.385409(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
0a449c179f1b0c496151dd88b943693f30fecca9 |
# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kekmodel/StopCarbon-10.7B-v3](https://huggingface.co/kekmodel/StopCarbon-10.7B-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v3",
"harness_winogrande_5",
split="train")
```
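The aggregated scores live in the separate "results" configuration mentioned above. A minimal sketch of pulling them, assuming you want the most recent run (which is what the `latest` split points to):

```python
from datasets import load_dataset

# Aggregated metrics for this model; the "latest" split always tracks the most recent run.
results = load_dataset("open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v3",
	"results",
	split="latest")

# Typically a single row per run, holding the aggregated scores reported below.
print(results[0])
```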
## Latest results
These are the [latest results from run 2023-12-30T10:15:50.941228](https://huggingface.co/datasets/open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v3/blob/main/results_2023-12-30T10-15-50.941228.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6649827029825734,
"acc_stderr": 0.03166471620730208,
"acc_norm": 0.6659533597079996,
"acc_norm_stderr": 0.03230745700615819,
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7193506614125464,
"mc2_stderr": 0.014949525122441177
},
"harness|arc:challenge|25": {
"acc": 0.6860068259385665,
"acc_stderr": 0.013562691224726293,
"acc_norm": 0.7098976109215017,
"acc_norm_stderr": 0.013261573677520764
},
"harness|hellaswag|10": {
"acc": 0.7180840470025891,
"acc_stderr": 0.004490130691020433,
"acc_norm": 0.8856801433977295,
"acc_norm_stderr": 0.003175490413694419
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4947089947089947,
"acc_stderr": 0.02574986828855657,
"acc_norm": 0.4947089947089947,
"acc_norm_stderr": 0.02574986828855657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.022037217340267822,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.022037217340267822
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857406,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857406
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669235,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.03381200005643527,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.03381200005643527
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025046,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776678,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776678
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381396,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381396
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7746913580246914,
"acc_stderr": 0.023246202647819753,
"acc_norm": 0.7746913580246914,
"acc_norm_stderr": 0.023246202647819753
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4908735332464146,
"acc_stderr": 0.01276810860164001,
"acc_norm": 0.4908735332464146,
"acc_norm_stderr": 0.01276810860164001
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7463235294117647,
"acc_stderr": 0.026431329870789527,
"acc_norm": 0.7463235294117647,
"acc_norm_stderr": 0.026431329870789527
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857834,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857834
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7193506614125464,
"mc2_stderr": 0.014949525122441177
},
"harness|winogrande|5": {
"acc": 0.8318863456985004,
"acc_stderr": 0.010510336954166732
},
"harness|gsm8k|5": {
"acc": 0.6322971948445792,
"acc_stderr": 0.013281630503395475
}
}
```
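If you only need a quick summary, the JSON snippet above can also be post-processed directly instead of re-downloading the dataset. A small sketch, assuming the snippet has been saved locally as `results.json` (a hypothetical filename):

```python
import json

# "results.json" is assumed to contain the JSON snippet shown above.
with open("results.json") as f:
    scores = json.load(f)

# Mean accuracy over the MMLU (hendrycksTest) subtasks reported for this run.
mmlu = [v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```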
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v3 | [
"region:us"
] | 2023-12-30T10:18:07+00:00 | {"pretty_name": "Evaluation run of kekmodel/StopCarbon-10.7B-v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [kekmodel/StopCarbon-10.7B-v3](https://huggingface.co/kekmodel/StopCarbon-10.7B-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T10:15:50.941228](https://huggingface.co/datasets/open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v3/blob/main/results_2023-12-30T10-15-50.941228.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6649827029825734,\n \"acc_stderr\": 0.03166471620730208,\n \"acc_norm\": 0.6659533597079996,\n \"acc_norm_stderr\": 0.03230745700615819,\n \"mc1\": 0.572827417380661,\n \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7193506614125464,\n \"mc2_stderr\": 0.014949525122441177\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6860068259385665,\n \"acc_stderr\": 0.013562691224726293,\n \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520764\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7180840470025891,\n \"acc_stderr\": 0.004490130691020433,\n \"acc_norm\": 0.8856801433977295,\n \"acc_norm_stderr\": 0.003175490413694419\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n 
\"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4947089947089947,\n \"acc_stderr\": 0.02574986828855657,\n \"acc_norm\": 0.4947089947089947,\n \"acc_norm_stderr\": 0.02574986828855657\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n \"acc_stderr\": 0.022037217340267822,\n \"acc_norm\": 0.8161290322580645,\n \"acc_norm_stderr\": 0.022037217340267822\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857406,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857406\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669235,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669235\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.03381200005643527,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.03381200005643527\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025046,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025046\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776678,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776678\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n \"acc_stderr\": 0.014036945850381396,\n \"acc_norm\": 
0.80970625798212,\n \"acc_norm_stderr\": 0.014036945850381396\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7746913580246914,\n \"acc_stderr\": 0.023246202647819753,\n \"acc_norm\": 0.7746913580246914,\n \"acc_norm_stderr\": 0.023246202647819753\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4908735332464146,\n \"acc_stderr\": 0.01276810860164001,\n \"acc_norm\": 0.4908735332464146,\n \"acc_norm_stderr\": 0.01276810860164001\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789527,\n \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789527\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857834,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857834\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.572827417380661,\n \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7193506614125464,\n \"mc2_stderr\": 0.014949525122441177\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.010510336954166732\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6322971948445792,\n \"acc_stderr\": 0.013281630503395475\n }\n}\n```", "repo_url": 
"https://huggingface.co/kekmodel/StopCarbon-10.7B-v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|arc:challenge|25_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|gsm8k|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hellaswag|10_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-15-50.941228.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-15-50.941228.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-15-50.941228.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T10-15-50.941228.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-15-50.941228.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-15-50.941228.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["**/details_harness|winogrande|5_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T10-15-50.941228.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_30T10_15_50.941228", "path": ["results_2023-12-30T10-15-50.941228.parquet"]}, {"split": "latest", "path": 
["results_2023-12-30T10-15-50.941228.parquet"]}]}]} | 2023-12-30T10:18:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v3
Dataset automatically created during the evaluation run of model kekmodel/StopCarbon-10.7B-v3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
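A minimal loading sketch, mirroring the snippet used by the other leaderboard cards in this collection; the details repo id is an assumption inferred from the model name and the usual `details_<org>__<model>` naming pattern:

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard naming convention (details_<org>__<model>);
# any of the 63 configurations can be requested in place of "harness_winogrande_5".
data = load_dataset("open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v3",
	"harness_winogrande_5",
	split="train")
```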
## Latest results
These are the latest results from run 2023-12-30T10:15:50.941228 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v3\n\n\n\nDataset automatically created during the evaluation run of model kekmodel/StopCarbon-10.7B-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T10:15:50.941228(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v3\n\n\n\nDataset automatically created during the evaluation run of model kekmodel/StopCarbon-10.7B-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T10:15:50.941228(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v3\n\n\n\nDataset automatically created during the evaluation run of model kekmodel/StopCarbon-10.7B-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T10:15:50.941228(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
044dbd393ec46320124d594db9b3273a999ba506 | Overview
This dataset provides a collection of over 35,000 tokens of text adhering to the New York Times writing style guide. The data is formatted in JSON and is suitable for various natural language processing tasks, text generation, style transfer, and more.
Key Features
Format: JSON
Number of tokens: 35,000+
Language model used: Notux 8x7B v1
License: MIT open-source license
Accessibility: Freely available for use
Usage
This dataset can be used for a wide range of applications, including:
Text generation: Train language models to generate text that aligns with the NYT writing style.
Style transfer: Adapt existing text to match the NYT style guide.
Content analysis: Analyze the linguistic patterns and characteristics of NYT writing.
Educational purposes: Teach and learn about writing style and its impact on communication.
Technical Details
File format: JSON
Character encoding: UTF-8
Data structure: Array of objects, each representing a token with its corresponding text and metadata.
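A minimal loading sketch, assuming the repository's JSON file is picked up by the Hugging Face `datasets` library's automatic data-file detection (the exact field names inside each object are not fixed by this card, so the example simply inspects the first record):

```python
from datasets import load_dataset

# Loads the JSON array of token objects; a single data file maps to the "train" split.
ds = load_dataset("TuringsSolutions/NYTWritingStyleGuide")

# Each record is one object from the array; print it to see its text and metadata fields.
print(ds["train"][0])
```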
Personal Note
I believe that data, like information, should not be confined to the domain of any single person or entity. It should be freely accessible and shared for the benefit of all. This dataset is released under an open-source license to promote this philosophy and encourage open collaboration and knowledge sharing.
Acknowledgments
The creation of this dataset was made possible by Notux 8x7B v1 and the generosity of those who contributed to its development.
License
This dataset is licensed under the MIT open-source license. | TuringsSolutions/NYTWritingStyleGuide | [
"license:mit",
"region:us"
] | 2023-12-30T10:19:15+00:00 | {"license": "mit"} | 2023-12-30T10:31:25+00:00 | [] | [] | TAGS
#license-mit #region-us
| Overview
This dataset provides a collection of over 35,000 tokens of text adhering to the New York Times writing style guide. The data is formatted in JSON and is suitable for various natural language processing tasks, text generation, style transfer, and more.
Key Features
Format: JSON
Number of tokens: 35,000+
Language model used: Notux 8x7B v1
License: MIT open-source license
Accessibility: Freely available for use
Usage
This dataset can be used for a wide range of applications, including:
Text generation: Train language models to generate text that aligns with the NYT writing style.
Style transfer: Adapt existing text to match the NYT style guide.
Content analysis: Analyze the linguistic patterns and characteristics of NYT writing.
Educational purposes: Teach and learn about writing style and its impact on communication.
Technical Details
File format: JSON
Character encoding: UTF-8
Data structure: Array of objects, each representing a token with its corresponding text and metadata.
Personal Note
I believe that data, like information, should not be confined to the domain of any single person or entity. It should be freely accessible and shared for the benefit of all. This dataset is released under an open-source license to promote this philosophy and encourage open collaboration and knowledge sharing.
Acknowledgments
The creation of this dataset was made possible by Notux 8x7B v1 and the generosity of those who contributed to its development.
License
This dataset is licensed under the MIT open-source license. | [] | [
"TAGS\n#license-mit #region-us \n"
] | [
11
] | [
"passage: TAGS\n#license-mit #region-us \n"
] |
df85848c7be3a160c4746ea78311299f7cf61861 |
SimpleQuestions dataset based on Wikidata with labels
Source https://github.com/askplatypus/wikidata-simplequestions/tree/master
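A minimal loading sketch (the `answerable` configuration, its splits, and its columns are taken from the dataset metadata below):

```python
from datasets import load_dataset

# "answerable" is the default configuration; splits are train / valid / test.
sqwd = load_dataset("s-nlp/sqwd", "answerable", split="train")

# Each row pairs a Wikidata triple (subject, property, object) with a natural
# language question and the object's human-readable label.
print(sqwd[0])
```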
Wikidata Dump *Aug 2, 2023* | s-nlp/sqwd | [
"task_categories:question-answering",
"language:en",
"region:us"
] | 2023-12-30T10:19:47+00:00 | {"language": ["en"], "task_categories": ["question-answering"], "pretty_name": "SimpleQuestions Wikidata", "dataset_info": {"config_name": "answerable", "features": [{"name": "subject", "dtype": "string"}, {"name": "property", "dtype": "string"}, {"name": "object", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "object_label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1737046, "num_examples": 19481}, {"name": "valid", "num_bytes": 251802, "num_examples": 2821}, {"name": "test", "num_bytes": 502252, "num_examples": 5622}], "download_size": 1555624, "dataset_size": 2491100}, "configs": [{"config_name": "answerable", "data_files": [{"split": "train", "path": "answerable/train-*"}, {"split": "valid", "path": "answerable/valid-*"}, {"split": "test", "path": "answerable/test-*"}], "default": true}]} | 2023-12-30T10:25:11+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #language-English #region-us
|
SimpleQuestions dataset based on Wikidata with labels
Source URL
Wikidata Dump *Aug 2, 2023* | [] | [
"TAGS\n#task_categories-question-answering #language-English #region-us \n"
] | [
22
] | [
"passage: TAGS\n#task_categories-question-answering #language-English #region-us \n"
] |
0e55bcfafe1b2e410a596a073ab4a81d0dc930aa |
# Dataset Card for Evaluation run of sophosympatheia/Aurora-Nights-70B-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sophosympatheia/Aurora-Nights-70B-v1.0](https://huggingface.co/sophosympatheia/Aurora-Nights-70B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sophosympatheia__Aurora-Nights-70B-v1.0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-30T10:37:31.144235](https://huggingface.co/datasets/open-llm-leaderboard/details_sophosympatheia__Aurora-Nights-70B-v1.0/blob/main/results_2023-12-30T10-37-31.144235.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7054744523563992,
"acc_stderr": 0.030133399589619393,
"acc_norm": 0.7078376241180532,
"acc_norm_stderr": 0.03072510235749947,
"mc1": 0.4528763769889841,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.6281358101050266,
"mc2_stderr": 0.014981280535224054
},
"harness|arc:challenge|25": {
"acc": 0.6732081911262798,
"acc_stderr": 0.013706665975587336,
"acc_norm": 0.7133105802047781,
"acc_norm_stderr": 0.013214986329274774
},
"harness|hellaswag|10": {
"acc": 0.6980681139215296,
"acc_stderr": 0.0045815761241797485,
"acc_norm": 0.8832901812387971,
"acc_norm_stderr": 0.003204180072942386
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.031103182383123384,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.031103182383123384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.02749566368372406,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.02749566368372406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267438,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267438
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663434,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663434
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6893617021276596,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.6893617021276596,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.025634258115554958,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.025634258115554958
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5467980295566502,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.5467980295566502,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.02325315795194209,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.02325315795194209
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.735897435897436,
"acc_stderr": 0.02235219373745328,
"acc_norm": 0.735897435897436,
"acc_norm_stderr": 0.02235219373745328
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.027205371538279472,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.027205371538279472
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8899082568807339,
"acc_stderr": 0.013419939018681203,
"acc_norm": 0.8899082568807339,
"acc_norm_stderr": 0.013419939018681203
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.021331741829746793,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.021331741829746793
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7847533632286996,
"acc_stderr": 0.027584066602208274,
"acc_norm": 0.7847533632286996,
"acc_norm_stderr": 0.027584066602208274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.03154521672005473,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.03154521672005473
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622814,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622814
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.03343270062869621,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.03343270062869621
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580663,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822582,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822582
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899091,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899091
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8620689655172413,
"acc_stderr": 0.012331009307795656,
"acc_norm": 0.8620689655172413,
"acc_norm_stderr": 0.012331009307795656
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.0218552552634218,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.0218552552634218
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5195530726256983,
"acc_stderr": 0.016709709877662,
"acc_norm": 0.5195530726256983,
"acc_norm_stderr": 0.016709709877662
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.023929155517351294,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.023929155517351294
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.023839303311398195,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.023839303311398195
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8302469135802469,
"acc_stderr": 0.020888690414093868,
"acc_norm": 0.8302469135802469,
"acc_norm_stderr": 0.020888690414093868
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5815602836879432,
"acc_stderr": 0.029427994039420004,
"acc_norm": 0.5815602836879432,
"acc_norm_stderr": 0.029427994039420004
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5567144719687093,
"acc_stderr": 0.012687818419599916,
"acc_norm": 0.5567144719687093,
"acc_norm_stderr": 0.012687818419599916
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7536764705882353,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.7536764705882353,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.017282760695167418,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.017282760695167418
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.040693063197213775,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.040693063197213775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.025000256039546188,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.025000256039546188
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.02567934272327692,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.02567934272327692
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4528763769889841,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.6281358101050266,
"mc2_stderr": 0.014981280535224054
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.010470796496781091
},
"harness|gsm8k|5": {
"acc": 0.6633813495072024,
"acc_stderr": 0.013016463679983359
}
}
```
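The aggregated numbers above are also available through the "results" configuration mentioned earlier; a minimal sketch, assuming the "latest" split name defined in the dataset metadata resolves to the newest run:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model.
results = load_dataset("open-llm-leaderboard/details_sophosympatheia__Aurora-Nights-70B-v1.0",
	"results",
	split="latest")
print(results[0])
```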
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_sophosympatheia__Aurora-Nights-70B-v1.0 | [
"region:us"
] | 2023-12-30T10:39:54+00:00 | {"pretty_name": "Evaluation run of sophosympatheia/Aurora-Nights-70B-v1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [sophosympatheia/Aurora-Nights-70B-v1.0](https://huggingface.co/sophosympatheia/Aurora-Nights-70B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sophosympatheia__Aurora-Nights-70B-v1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T10:37:31.144235](https://huggingface.co/datasets/open-llm-leaderboard/details_sophosympatheia__Aurora-Nights-70B-v1.0/blob/main/results_2023-12-30T10-37-31.144235.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7054744523563992,\n \"acc_stderr\": 0.030133399589619393,\n \"acc_norm\": 0.7078376241180532,\n \"acc_norm_stderr\": 0.03072510235749947,\n \"mc1\": 0.4528763769889841,\n \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6281358101050266,\n \"mc2_stderr\": 0.014981280535224054\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6732081911262798,\n \"acc_stderr\": 0.013706665975587336,\n \"acc_norm\": 0.7133105802047781,\n \"acc_norm_stderr\": 0.013214986329274774\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6980681139215296,\n \"acc_stderr\": 0.0045815761241797485,\n \"acc_norm\": 0.8832901812387971,\n \"acc_norm_stderr\": 0.003204180072942386\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.031103182383123384,\n \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.031103182383123384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.02749566368372406,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.02749566368372406\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n \"acc_norm_stderr\": 0.032166008088022675\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.03514942551267438,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.03514942551267438\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663434,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663434\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6893617021276596,\n \"acc_stderr\": 0.03025123757921317,\n \"acc_norm\": 0.6893617021276596,\n \"acc_norm_stderr\": 0.03025123757921317\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.025634258115554958,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.025634258115554958\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5467980295566502,\n \"acc_stderr\": 0.03502544650845872,\n \"acc_norm\": 0.5467980295566502,\n \"acc_norm_stderr\": 0.03502544650845872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.02325315795194209,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.02325315795194209\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n \"acc_norm\": 0.927461139896373,\n 
\"acc_norm_stderr\": 0.018718998520678178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.735897435897436,\n \"acc_stderr\": 0.02235219373745328,\n \"acc_norm\": 0.735897435897436,\n \"acc_norm_stderr\": 0.02235219373745328\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.027205371538279472,\n \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.027205371538279472\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8899082568807339,\n \"acc_stderr\": 0.013419939018681203,\n \"acc_norm\": 0.8899082568807339,\n \"acc_norm_stderr\": 0.013419939018681203\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.03309682581119035,\n \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.03309682581119035\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8776371308016878,\n \"acc_stderr\": 0.021331741829746793,\n \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.021331741829746793\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n \"acc_stderr\": 0.027584066602208274,\n \"acc_norm\": 0.7847533632286996,\n \"acc_norm_stderr\": 0.027584066602208274\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.03154521672005473,\n \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.03154521672005473\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622814,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622814\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.03343270062869621,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.03343270062869621\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580663,\n \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580663\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822582,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822582\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899091,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899091\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8620689655172413,\n \"acc_stderr\": 0.012331009307795656,\n \"acc_norm\": 0.8620689655172413,\n \"acc_norm_stderr\": 0.012331009307795656\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.0218552552634218,\n \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.0218552552634218\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5195530726256983,\n \"acc_stderr\": 0.016709709877662,\n \"acc_norm\": 0.5195530726256983,\n \"acc_norm_stderr\": 0.016709709877662\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.023929155517351294,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.023929155517351294\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n \"acc_stderr\": 0.023839303311398195,\n \"acc_norm\": 0.7717041800643086,\n \"acc_norm_stderr\": 0.023839303311398195\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8302469135802469,\n \"acc_stderr\": 0.020888690414093868,\n \"acc_norm\": 0.8302469135802469,\n \"acc_norm_stderr\": 0.020888690414093868\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5815602836879432,\n \"acc_stderr\": 0.029427994039420004,\n \"acc_norm\": 0.5815602836879432,\n \"acc_norm_stderr\": 0.029427994039420004\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5567144719687093,\n \"acc_stderr\": 0.012687818419599916,\n \"acc_norm\": 0.5567144719687093,\n \"acc_norm_stderr\": 0.012687818419599916\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.017282760695167418,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.017282760695167418\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.040693063197213775,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.040693063197213775\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.025000256039546188,\n \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.025000256039546188\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.02567934272327692,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.02567934272327692\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4528763769889841,\n \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6281358101050266,\n \"mc2_stderr\": 0.014981280535224054\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781091\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6633813495072024,\n \"acc_stderr\": 0.013016463679983359\n }\n}\n```", "repo_url": "https://huggingface.co/sophosympatheia/Aurora-Nights-70B-v1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|arc:challenge|25_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|gsm8k|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hellaswag|10_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-37-31.144235.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-37-31.144235.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-37-31.144235.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T10-37-31.144235.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-37-31.144235.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_30T10_37_31.144235", "path": ["**/details_harness|winogrande|5_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T10-37-31.144235.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_30T10_37_31.144235", "path": ["results_2023-12-30T10-37-31.144235.parquet"]}, {"split": "latest", "path": ["results_2023-12-30T10-37-31.144235.parquet"]}]}]} | 2023-12-30T10:40:15+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of sophosympatheia/Aurora-Nights-70B-v1.0
Dataset automatically created during the evaluation run of model sophosympatheia/Aurora-Nights-70B-v1.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
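A minimal sketch, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming; the exact repo path and the config chosen below are assumptions:

```python
from datasets import load_dataset

# Repo path assumed from the leaderboard's naming convention for per-model detail datasets.
data = load_dataset(
    "open-llm-leaderboard/details_sophosympatheia__Aurora-Nights-70B-v1.0",
    "harness_winogrande_5",  # any config listed in this card's metadata works here
    split="train",
)
```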
## Latest results
These are the latest results from run 2023-12-30T10:37:31.144235 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of sophosympatheia/Aurora-Nights-70B-v1.0\n\n\n\nDataset automatically created during the evaluation run of model sophosympatheia/Aurora-Nights-70B-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T10:37:31.144235(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of sophosympatheia/Aurora-Nights-70B-v1.0\n\n\n\nDataset automatically created during the evaluation run of model sophosympatheia/Aurora-Nights-70B-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-30T10:37:31.144235(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
199,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of sophosympatheia/Aurora-Nights-70B-v1.0\n\n\n\nDataset automatically created during the evaluation run of model sophosympatheia/Aurora-Nights-70B-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T10:37:31.144235(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
83ad7ebf8b2a7f0e5d56db4fd6203e67c0c6fdcd | # Dataset Card for RLHF-V-Dataset
[Project Page](https://rlhf-v.github.io/) | [Paper](https://arxiv.org/abs/2312.00849) | [GitHub](https://github.com/RLHF-V/RLHF-V)
## Updates
**[2024.01.06]** 🔥 **A larger, more diverse set of fine-grained human correction data is available now!** 🔥 The newly released data has about **5.7k of fine-grained human correction data** that covers the output of **more powerful models** (Qwen-VL-Chat, InstructBLIP, etc.). We also **expand the image types** from everyday scenes to diverse styles and themes (WikiArt, landmarks, scene texts, etc.).
**[2024.01.05]** 🔧 We reformat our dataset and now it is **more convenient to preview and use** our data! The dataset now supports the `load_dataset` function, and the data content can be easily previewed online.
**[2023.12.15]** We incorporated a new annotation subset with an additional **1065 fine-grained annotations** into our dataset!
## Dataset Summary
RLHF-V-Dataset is the human preference data used in "**RLHF-V: Towards Trustworthy MLLMs via Behavior Alignment from Fine-grained Correctional Human Feedback**".
We originally collected a large amount of **fine-grained segment-level human corrections** on diverse instructions, including detailed descriptions and question-answering instructions. More high-quality annotations for different image sources and model outputs are on the way.
<p align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/6566e0c493e30c8a60048eb3/jerEZiHDDc2ceF9anVHR-.png" alt="fig1" width="60%"/>
</p>
Utilizing our dataset can dramatically **reduce model hallucinations by 34.8%** while **keeping informativeness**.
<p align="center">
<img src="https://cdn-uploads.huggingface.co/production/uploads/6566e0c493e30c8a60048eb3/7xJEdKXeW33iKdHqJwvNN.png" alt="fig2" width="70%"/>
</p>
## Usage
```python
from datasets import load_dataset
data = load_dataset("HaoyeZhang/RLHF-V-Dataset")
```
## Data fields
| | Key | Description |
| ---- | ---------------- | ------------------------------------------------------------ |
| 0 | `ds_name` | Dataset name. |
| 1 | `image` | Dict contains path and bytes. If loaded by `load_dataset`, it can be automatically converted into a PIL Image. |
| 2 | `text` | Preference data. Each data item contains a dict with the keys "question", "chosen", and "rejected". |
| 3 | `origin_dataset` | Original dataset for annotation, which is not used in training. |
| 4 | `origin_split` | Meta information for each data item, including the name of the model we use to generate the original answer, and the question type ("detailed description" or "question answering") |
| 5 | `idx` | Data index. |
| 6 | `image_path` | Image path. |
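A minimal sketch of reading these fields; the split name and the JSON decoding of the `text` field are assumptions based on the schema above, not guarantees:

```python
import json
from datasets import load_dataset

data = load_dataset("HaoyeZhang/RLHF-V-Dataset", split="train")  # assumed default split name

sample = data[0]
print(sample["ds_name"], sample["origin_dataset"], sample["origin_split"], sample["idx"])

# The "text" field holds the preference triple; here we assume it may be stored as a JSON string.
pref = json.loads(sample["text"]) if isinstance(sample["text"], str) else sample["text"]
print(pref["question"])
print(pref["chosen"])
print(pref["rejected"])

# "image" is decoded to a PIL Image by `load_dataset`, so it can be saved or displayed directly.
sample["image"].save("example.png")
```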
## Citation
```
@article{2023rlhf-v,
author = {Tianyu Yu and Yuan Yao and Haoye Zhang and Taiwen He and Yifeng Han and Ganqu Cui and Jinyi Hu and Zhiyuan Liu and Hai-Tao Zheng and Maosong Sun and Tat-Seng Chua},
title = {RLHF-V: Towards Trustworthy MLLMs via Behavior Alignment from Fine-grained Correctional Human Feedback},
journal = {arxiv},
year = {2023},
}
``` | HaoyeZhang/RLHF-V-Dataset | [
"task_categories:conversational",
"task_categories:text-generation",
"task_categories:visual-question-answering",
"size_categories:1K<n<10K",
"language:en",
"license:cc-by-nc-4.0",
"arxiv:2312.00849",
"region:us"
] | 2023-12-30T11:35:38+00:00 | {"language": ["en"], "license": "cc-by-nc-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["conversational", "text-generation", "visual-question-answering"], "pretty_name": "RLHF-V-Dataset", "configs": [{"config_name": "default", "data_files": "RLHF-V-Dataset.parquet"}], "dataset_info": {"features": [{"name": "ds_name", "dtype": "string"}, {"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}, {"name": "origin_dataset", "dtype": "string"}, {"name": "origin_split", "dtype": "string"}, {"name": "idx", "dtype": "int64"}, {"name": "image_path", "dtype": "string"}]}} | 2024-01-07T15:09:35+00:00 | [
"2312.00849"
] | [
"en"
] | TAGS
#task_categories-conversational #task_categories-text-generation #task_categories-visual-question-answering #size_categories-1K<n<10K #language-English #license-cc-by-nc-4.0 #arxiv-2312.00849 #region-us
| Dataset Card for RLHF-V-Dataset
===============================
Project Page | Paper | GitHub
Updates
-------
[2024.01.06] A larger, more diverse set of fine-grained human correction data is available now! The newly released data has about 5.7k of fine-grained human correction data that covers the output of more powerful models (Qwen-VL-Chat, InstructBLIP, etc.). We also expand the image types from everyday scenes to diverse styles and themes (WikiArt, landmarks, scene texts, etc.).
[2024.01.05] We reformat our dataset and now it is more convenient to preview and use our data! The dataset now supports the 'load\_dataset' function, and the data content can be easily previewed online.
[2023.12.15] We incorporated a new annotation subset with an additional 1065 fine-grained annotations into our dataset!
Dataset Summary
---------------
RLHF-V-Dataset is the human preference data used in "RLHF-V: Towards Trustworthy MLLMs via Behavior Alignment from Fine-grained Correctional Human Feedback".
We originally collected a large amount of fine-grained segment-level human corrections on diverse instructions, including detailed descriptions and question-answering instructions. More high-quality annotations for different image sources and model outputs are on the way.

Utilizing our dataset can dramatically reduce model hallucinations by 34.8% while keeping informativeness.

Usage
-----
Data fields
-----------
Key 0, 'ds\_name': dataset name.
Key 1, 'image': dict containing the image path and bytes; convertible to a PIL Image when loaded with 'load\_dataset'.
Key 2, 'text': preference data with the keys "question", "chosen", and "rejected".
Key 3, 'origin\_dataset': original dataset used for annotation (not used in training).
Key 4, 'origin\_split': meta information, including the model used to generate the original answer and the question type.
Key 5, 'idx': data index.
Key 6, 'image\_path': image path.
| [] | [
"TAGS\n#task_categories-conversational #task_categories-text-generation #task_categories-visual-question-answering #size_categories-1K<n<10K #language-English #license-cc-by-nc-4.0 #arxiv-2312.00849 #region-us \n"
] | [
77
] | [
"passage: TAGS\n#task_categories-conversational #task_categories-text-generation #task_categories-visual-question-answering #size_categories-1K<n<10K #language-English #license-cc-by-nc-4.0 #arxiv-2312.00849 #region-us \n"
] |