sha | text | id | tags | created_at | metadata | last_modified | arxiv | languages | tags_str | text_str | text_lists | processed_texts | tokens_length | input_texts
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
978ad82192f9ca1e447a67f11eaf2490e7230744
|
UPLOADED FOR EDUCATIONAL PURPOSES
Audio dataset created from Gothic 1 video game files, covering the German, English, Polish, and Russian (Snowball version) dubbings.
It contains both the wav audio files of the separated dialogue and a text transcription file per NPC.
Example of the data structure:<br><br>
g1_[language].zip<br>
-[folder-characterName-]bob(folder)<br>
--Audio1.wav<br>
--Audio2.wav<br>
-[characterName-]steve(folder)<br>
--Audio3.wav<br>
--Audio4.wav<br>
-[characterName]bob.txt<br>
-[characterName]steve.txt<br><br><br>
<br>Here is the process of extracting the audio files from the Gothic game using the modding tools, as well as the Python script for formatting the dialogue transcriptions:<br>
https://youtu.be/2iV-FWuiVFo
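For working with the extracted files, here is a minimal Python sketch that pairs each NPC's transcription file with that NPC's audio folder (a sketch only, assuming the layout shown above; the `g1_en` folder name and the stem-to-folder matching are illustrative assumptions):
```python
from pathlib import Path

# Assumed layout after unzipping g1_[language].zip: one folder of .wav files
# per NPC plus one .txt transcription per NPC, both named after the character.
root = Path("g1_en")  # illustrative name for an unzipped archive
for transcript in sorted(root.glob("*.txt")):
    npc = transcript.stem
    npc_dir = root / npc
    wavs = sorted(npc_dir.glob("*.wav")) if npc_dir.is_dir() else []
    print(f"{npc}: {len(wavs)} audio files, transcript: {transcript.name}")
```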
|
Amo/Gothic-1-multilingual-dialogue
|
[
"size_categories:1B<n<4B",
"language:en",
"language:de",
"language:pl",
"language:ru",
"region:us"
] |
2023-08-31T16:02:25+00:00
|
{"language": ["en", "de", "pl", "ru"], "size_categories": ["1B<n<4B"]}
|
2023-08-31T18:48:47+00:00
|
[] |
[
"en",
"de",
"pl",
"ru"
] |
TAGS
#size_categories-1B<n<4B #language-English #language-German #language-Polish #language-Russian #region-us
|
UPLOADED FOR EDUCATIONAL PURPOSES
Audio dataset created from Gothic 1 video game files, covering the German, English, Polish, and Russian (Snowball version) dubbings.
It contains both the wav audio files of the separated dialogue and a text transcription file per NPC.
Example of the data structure:<br><br>
g1_[language].zip<br>
-[folder-characterName-]bob(folder)<br>
--URL<br>
--URL<br>
-[characterName-]steve(folder)<br>
--URL<br>
--URL<br>
-[characterName]URL<br>
-[characterName]URL<br><br><br>
<br>Here is the process of extracting the audio files from the Gothic game using the modding tools, as well as the Python script for formatting the dialogue transcriptions:<br>
URL
|
[] |
[
"TAGS\n#size_categories-1B<n<4B #language-English #language-German #language-Polish #language-Russian #region-us \n"
] |
[
36
] |
[
"passage: TAGS\n#size_categories-1B<n<4B #language-English #language-German #language-Polish #language-Russian #region-us \n"
] |
998047a5221430e69860f07ab83cfd994e5391db
|
# Dataset Card for Evaluation run of lgaalves/gpt2_open-platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/gpt2_open-platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/gpt2_open-platypus](https://huggingface.co/lgaalves/gpt2_open-platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__gpt2_open-platypus",
"harness_winogrande_5",
split="train")
```
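The aggregated "results" configuration can be loaded the same way; a minimal sketch (the config name and the "latest" split come from this card's metadata, so treat the call as illustrative):
```python
from datasets import load_dataset

# "latest" always points at the most recent evaluation run; earlier runs
# live in timestamped splits such as "2023_10_15T13_45_26.230063".
results = load_dataset(
    "open-llm-leaderboard/details_lgaalves__gpt2_open-platypus",
    "results",
    split="latest",
)
```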
## Latest results
These are the [latest results from run 2023-10-15T13:45:26.230063](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2_open-platypus/blob/main/results_2023-10-15T13-45-26.230063.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964607695,
"f1": 0.04636010906040263,
"f1_stderr": 0.0012972722820894797,
"acc": 0.25726959447047076,
"acc_stderr": 0.007559748871273466
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964607695,
"f1": 0.04636010906040263,
"f1_stderr": 0.0012972722820894797
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.0010717793485492632
},
"harness|winogrande|5": {
"acc": 0.5130228887134964,
"acc_stderr": 0.01404771839399767
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_lgaalves__gpt2_open-platypus
|
[
"region:us"
] |
2023-08-31T16:11:23+00:00
|
{"pretty_name": "Evaluation run of lgaalves/gpt2_open-platypus", "dataset_summary": "Dataset automatically created during the evaluation run of model [lgaalves/gpt2_open-platypus](https://huggingface.co/lgaalves/gpt2_open-platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__gpt2_open-platypus\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T13:45:26.230063](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2_open-platypus/blob/main/results_2023-10-15T13-45-26.230063.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964607695,\n \"f1\": 0.04636010906040263,\n \"f1_stderr\": 0.0012972722820894797,\n \"acc\": 0.25726959447047076,\n \"acc_stderr\": 0.007559748871273466\n },\n \"harness|drop|3\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964607695,\n \"f1\": 0.04636010906040263,\n \"f1_stderr\": 0.0012972722820894797\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \"acc_stderr\": 0.0010717793485492632\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5130228887134964,\n \"acc_stderr\": 0.01404771839399767\n }\n}\n```", "repo_url": "https://huggingface.co/lgaalves/gpt2_open-platypus", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|arc:challenge|25_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T13_45_26.230063", "path": ["**/details_harness|drop|3_2023-10-15T13-45-26.230063.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T13-45-26.230063.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T13_45_26.230063", "path": ["**/details_harness|gsm8k|5_2023-10-15T13-45-26.230063.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T13-45-26.230063.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hellaswag|10_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T17:11:08.445217.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T17:11:08.445217.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T17:11:08.445217.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T17:11:08.445217.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T17:11:08.445217.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T13_45_26.230063", "path": ["**/details_harness|winogrande|5_2023-10-15T13-45-26.230063.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T13-45-26.230063.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_31T17_11_08.445217", "path": ["results_2023-08-31T17:11:08.445217.parquet"]}, {"split": "2023_10_15T13_45_26.230063", "path": ["results_2023-10-15T13-45-26.230063.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T13-45-26.230063.parquet"]}]}]}
|
2023-10-15T12:45:37+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of lgaalves/gpt2_open-platypus
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model lgaalves/gpt2_open-platypus on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-15T13:45:26.230063 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of lgaalves/gpt2_open-platypus",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model lgaalves/gpt2_open-platypus on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T13:45:26.230063(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of lgaalves/gpt2_open-platypus",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model lgaalves/gpt2_open-platypus on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T13:45:26.230063(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of lgaalves/gpt2_open-platypus## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model lgaalves/gpt2_open-platypus on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T13:45:26.230063(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
18a799fcd66f1db62ea7fd8c1b6c1ddf2a44a09a
|
# Dataset Card for "srbd1_segmented"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Lancelot53/srbd1_segmented
|
[
"region:us"
] |
2023-08-31T16:13:15+00:00
|
{"dataset_info": {"features": [{"name": "html", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1446076, "num_examples": 1496}], "download_size": 0, "dataset_size": 1446076}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-01T12:58:29+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "srbd1_segmented"
More Information needed
|
[
"# Dataset Card for \"srbd1_segmented\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"srbd1_segmented\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"srbd1_segmented\"\n\nMore Information needed"
] |
5434c6e98ac65fe4b42f424e6b417724d3eb0098
|
# MQA
An aggregation of datasets, as listed [here](https://huggingface.co/collections/euclaise/mqa-650f41afae507a2c7ca18b55)
I reserve no rights to the dataset, but the original datasets were made available under various public licenses. Hence, consider each subset of this dataset to be licensed under the same terms as the original dataset it comes from.
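A minimal loading sketch (the repository id and the field names `msg`, `resp_correct`, and `resp_incorrect` come from this card's metadata; treat the snippet as illustrative rather than an official example):
```python
from datasets import load_dataset

# Each row holds a question, the correct response, and a list of
# incorrect responses.
mqa = load_dataset("euclaise/mqa", split="train")
row = mqa[0]
print(row["msg"])             # the question
print(row["resp_correct"])    # the correct answer
print(row["resp_incorrect"])  # the incorrect answers
```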
|
euclaise/mqa
|
[
"task_categories:question-answering",
"size_categories:10K<n<100K",
"region:us"
] |
2023-08-31T16:15:10+00:00
|
{"size_categories": ["10K<n<100K"], "task_categories": ["question-answering"], "pretty_name": "MultiQA", "dataset_info": {"features": [{"name": "msg", "dtype": "string"}, {"name": "resp_correct", "dtype": "string"}, {"name": "resp_incorrect", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 20624051.02310231, "num_examples": 23408}], "download_size": 18672769, "dataset_size": 20624051.02310231}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-10-20T16:13:22+00:00
|
[] |
[] |
TAGS
#task_categories-question-answering #size_categories-10K<n<100K #region-us
|
# MQA
An aggregation of datasets, as listed here
I reserve no rights to the dataset, but the original datasets were made available under various public licenses. Hence, consider each subset of this dataset to be licensed under the same terms as the original dataset it comes from.
|
[
"# MQA\n\nAggregation of datasets as per here\n\nI reserve no rights to the dataset, but the original datasets were made available under various public licenses. Hence, consider each subset of this dataset to be licensed as the original dataset from where it comes was."
] |
[
"TAGS\n#task_categories-question-answering #size_categories-10K<n<100K #region-us \n",
"# MQA\n\nAggregation of datasets as per here\n\nI reserve no rights to the dataset, but the original datasets were made available under various public licenses. Hence, consider each subset of this dataset to be licensed as the original dataset from where it comes was."
] |
[
30,
63
] |
[
"passage: TAGS\n#task_categories-question-answering #size_categories-10K<n<100K #region-us \n# MQA\n\nAggregation of datasets as per here\n\nI reserve no rights to the dataset, but the original datasets were made available under various public licenses. Hence, consider each subset of this dataset to be licensed as the original dataset from where it comes was."
] |
62cbed9726deb4b874a209dc8e0b32029100e259
|

# 📔 **DATASET**
| **Dataset** | Class | Number of Questions |
| ------- | ----------------------------------------------------------------- | ------------------------ |
| **FLAN_CoT(zs)** | Reasoning, MATH, ScienceQA, Commonsense | 8000 |
| **Prm800k** | Reasoning, MATH | 6713 |
| **ScienceQA** | ScienceQA | 5177 |
| **SciBench** | ScienceQA | 695 |
| **ReClor** | Reasoning | 1624 |
| **TheoremQA** | Commonsense, MATH, ScienceQA | 800 |
| **OpenBookQA** | Text_Understanding, Reasoning, Commonsense, ScienceQA | 5957 |
| **ARB** | Reasoning, MATH, ScienceQA, Commonsense, Text_Understanding | 605 |
| **Openassistant-guanaco** | Commonsense, Text_Understanding, Reasoning | 802 |
# 📌 **Method**
## *Improving the dataset*
In addition to evaluating based on the finetune1 score, we found in earlier fine-tuning runs on the 200,000-, 50,000-, and 5,000-example subsets of 'dolphin' and 'openorca' that the smaller subsets scored better than the 200,000-example one. Therefore, we plan to prioritize testing with a smaller yet high-quality dataset. We will use the 'platypus' dataset and combine it with 'cot', applying stratified sampling and optimizing the dataset based on output length.
## *Dataset Format Definition*
Use "instruction、input、output" tend to lean towards guided datasets. In this format, each sample includes an instruction, an input, and an expected output. The instruction provides guidance on how to process the input to generate the output. This format of dataset is often used to train models to perform specific tasks, as they explicitly indicate the operations the model should perform.
```
{
"input": "",
"output": "",
"instruction": ""
}
```
- ### [FLAN_V2 COT(ZS)](https://huggingface.co/datasets/conceptofmind/cot_submix_original/tree/main)
We extract only the 'zs_opt' templates from the CoT data and categorize each task.
- ### [OTHER](https://github.com/arielnlee/Platypus/tree/main/data_pipeline)
Prm800k, ScienceQA, SciBench, ReClor, TheoremQA, OpenBookQA, ARB, and OpenAssistant-Guanaco datasets adopt the same format as Platypus.
## *Sampling Algorithms*
Since the flan_v2 cot dataset includes tasks like:
- cot_esnli
- cot_strategyqa
- cot_qasc
- stream_qed
- cot_gsm8k
- cot_ecqa
- cot_creak
- stream_aqua
To ensure this dataset contains diverse high-quality data, we first select zs_opt questions. Then, we keep only questions whose output length is at least the average length. This step aims to help the model learn richer reasoning steps. After that, we perform stratified sampling. Initially, we attempted stratified sampling before the length-based filtering, but we found that this approach resulted in varying sample sizes, making it challenging to reproduce. Thus, we decided to first filter by length and then perform stratified sampling.
```py
import json
import random

with open("cot_ORIGINAL.json", "r") as f:
    cot_data = json.load(f)

# --- part 1: keep only the "zs_opt" template records ---
zsopt_data = [record for record in cot_data if record["template_type"] == "zs_opt"]

# --- part 2: keep records whose output is at least the average length ---
average_length = sum(len(r["targets"]) for r in zsopt_data) / len(zsopt_data)
filtered_data = [r for r in zsopt_data if len(r["targets"]) >= average_length]

# Count the number of samples for each task (class)
class_counts = {}
for record in filtered_data:
    class_counts[record["task_name"]] = class_counts.get(record["task_name"], 0) + 1

# --- part 3: stratified sampling, proportional to each class's share ---
total_samples = 8000  # we plan to select a total of 8000 samples
# Note: because each per-class size is rounded, the final total can deviate
# slightly from total_samples.
sample_sizes = {
    task_name: round(count / len(filtered_data) * total_samples)
    for task_name, count in class_counts.items()
}

stratified_samples = {}  # perform stratified sampling for each class
for task_name, sample_size in sample_sizes.items():
    class_samples = [r for r in filtered_data if r["task_name"] == task_name]
    stratified_samples[task_name] = random.sample(class_samples, sample_size)

# Convert to the instruction/input/output format defined above
final_samples = [
    {
        "input": "",                      # left empty in this dataset
        "output": sample["targets"],      # the answer / reasoning steps
        "instruction": sample["inputs"],  # the question
    }
    for samples in stratified_samples.values()
    for sample in samples
]

with open("cot_change.json", "w") as f:
    json.dump(final_samples, f, indent=2)
```
# 🏁 **Future Work**
Under discussion...
|
huangyt/FINETUNE2
|
[
"license:openrail",
"region:us"
] |
2023-08-31T16:37:52+00:00
|
{"license": "openrail"}
|
2023-09-01T08:40:00+00:00
|
[] |
[] |
TAGS
#license-openrail #region-us
|
DATASET
=======
Dataset: FLAN\_CoT(zs); Class: Reasoning, MATH, ScienceQA, Commonsense; Number of Questions: 8000
Dataset: Prm800k; Class: Reasoning, MATH; Number of Questions: 6713
Dataset: ScienceQA; Class: ScienceQA; Number of Questions: 5177
Dataset: SciBench; Class: ScienceQA; Number of Questions: 695
Dataset: ReClor; Class: Reasoning; Number of Questions: 1624
Dataset: TheoremQA; Class: Commonsense, MATH, ScienceQA; Number of Questions: 800
Dataset: OpenBookQA; Class: Text\_Understanding, Reasoning, Commonsense, ScienceQA; Number of Questions: 5957
Dataset: ARB; Class: Reasoning, MATH, ScienceQA, Commonsense, Text\_Understanding; Number of Questions: 605
Dataset: Openassistant-guanaco; Class: Commonsense, Text\_Understanding, Reasoning; Number of Questions: 802
Method
======
*Improving the dataset*
-----------------------
In addition to evaluating based on the finetune1 score, we found that in earlier fine-tuning runs on 200,000-, 50,000-, and 5,000-sample versions of 'dolphin' and 'openorca', the smaller datasets scored better than the full 200,000-sample one. Therefore, we plan to prioritize testing with a smaller yet high-quality dataset. We will use the 'platypus' dataset and combine it with 'cot', applying stratified sampling and output-length-based filtering to optimize the dataset.
*Dataset Format Definition*
---------------------------
Use "instruction、input、output" tend to lean towards guided datasets. In this format, each sample includes an instruction, an input, and an expected output. The instruction provides guidance on how to process the input to generate the output. This format of dataset is often used to train models to perform specific tasks, as they explicitly indicate the operations the model should perform.
* ### FLAN\_V2 COT(ZS)
We extract only the 'zs\_opt' samples from COT and categorize each task.
* ### OTHER
Prm800k, ScienceQA, SciBench, ReClor, TheoremQA, OpenBookQA, ARB, and OpenAssistant-Guanaco datasets adopt the same format as Platypus.
*Sampling Algorithms*
---------------------
Since the flan\_v2 cot dataset includes tasks like:
* cot\_esnli
* cot\_strategyqa
* cot\_qasc
* stream\_qed
* cot\_gsm8k
* cot\_ecqa
* cot\_creak
* stream\_aqua
To ensure this dataset contains diverse, high-quality data, we first select the zs\_opt questions. Then we keep only the questions whose output length is at or above the average, filtering out the shorter ones; this step aims to help the model learn richer reasoning steps. After that, we perform stratified sampling. Initially, we attempted stratified sampling before the length-based filtering, but we found that this approach resulted in varying sample sizes, making it challenging to reproduce. Thus, we decided to first filter by length and then perform stratified sampling.
Future Work
============
Under discussion...
|
[
"### FLAN\\_V2 COT(ZS)\n\n\nWe only extract the 'zs\\_opt' from COT and categorize each task.\n* ### OTHER\n\n\nPrm800k, ScienceQA, SciBench, ReClor, TheoremQA, OpenBookQA, ARB, and OpenAssistant-Guanaco datasets adopt the same format as Platypus.\n\n\n*Sampling Algorithms*\n---------------------\n\n\nSince the flan\\_v2 cot dataset includes tasks like:\n\n\n* cot\\_esnli\n* cot\\_strategyqa\n* cot\\_qasc\n* stream\\_qed\n* cot\\_gsm8k\n* cot\\_ecqa\n* cot\\_creak\n* stream\\_aqua\n\n\nTo ensure this dataset contains diverse high-quality data, we first select zs\\_opt questions. Then, we filter out questions with output lengths exceeding the average length. This step aims to help the model learn richer reasoning steps. After that, we perform stratified sampling. Initially, we attempted stratified sampling before the length-based filtering, but we found that this approach resulted in varying sample sizes, making it challenging to reproduce. Thus, we decided to first filter by length and then perform stratified sampling.\n\n\nFeature Work\n============\n\n\nUnder discussion..."
] |
[
"TAGS\n#license-openrail #region-us \n",
"### FLAN\\_V2 COT(ZS)\n\n\nWe only extract the 'zs\\_opt' from COT and categorize each task.\n* ### OTHER\n\n\nPrm800k, ScienceQA, SciBench, ReClor, TheoremQA, OpenBookQA, ARB, and OpenAssistant-Guanaco datasets adopt the same format as Platypus.\n\n\n*Sampling Algorithms*\n---------------------\n\n\nSince the flan\\_v2 cot dataset includes tasks like:\n\n\n* cot\\_esnli\n* cot\\_strategyqa\n* cot\\_qasc\n* stream\\_qed\n* cot\\_gsm8k\n* cot\\_ecqa\n* cot\\_creak\n* stream\\_aqua\n\n\nTo ensure this dataset contains diverse high-quality data, we first select zs\\_opt questions. Then, we filter out questions with output lengths exceeding the average length. This step aims to help the model learn richer reasoning steps. After that, we perform stratified sampling. Initially, we attempted stratified sampling before the length-based filtering, but we found that this approach resulted in varying sample sizes, making it challenging to reproduce. Thus, we decided to first filter by length and then perform stratified sampling.\n\n\nFeature Work\n============\n\n\nUnder discussion..."
] |
[
12,
311
] |
[
"passage: TAGS\n#license-openrail #region-us \n### FLAN\\_V2 COT(ZS)\n\n\nWe only extract the 'zs\\_opt' from COT and categorize each task.\n* ### OTHER\n\n\nPrm800k, ScienceQA, SciBench, ReClor, TheoremQA, OpenBookQA, ARB, and OpenAssistant-Guanaco datasets adopt the same format as Platypus.\n\n\n*Sampling Algorithms*\n---------------------\n\n\nSince the flan\\_v2 cot dataset includes tasks like:\n\n\n* cot\\_esnli\n* cot\\_strategyqa\n* cot\\_qasc\n* stream\\_qed\n* cot\\_gsm8k\n* cot\\_ecqa\n* cot\\_creak\n* stream\\_aqua\n\n\nTo ensure this dataset contains diverse high-quality data, we first select zs\\_opt questions. Then, we filter out questions with output lengths exceeding the average length. This step aims to help the model learn richer reasoning steps. After that, we perform stratified sampling. Initially, we attempted stratified sampling before the length-based filtering, but we found that this approach resulted in varying sample sizes, making it challenging to reproduce. Thus, we decided to first filter by length and then perform stratified sampling.\n\n\nFeature Work\n============\n\n\nUnder discussion..."
] |
b9b49a0fc4255d804d9e71c1a1afb0efb2337afc
|
# Dataset Card for "cc100_meta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
eduagarcia-temp/cc100_meta
|
[
"region:us"
] |
2023-08-31T16:48:22+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}]}], "splits": [{"name": "train", "num_bytes": 55033381569, "num_examples": 38999388}], "download_size": 35074345417, "dataset_size": 55033381569}}
|
2023-08-31T20:25:25+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "cc100_meta"
More Information needed
|
[
"# Dataset Card for \"cc100_meta\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"cc100_meta\"\n\nMore Information needed"
] |
[
6,
14
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"cc100_meta\"\n\nMore Information needed"
] |
43d9e81162e773a3b92d7b23cd2cb601db72441f
|
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
vellorejana/COBOLCODE
|
[
"region:us"
] |
2023-08-31T16:58:38+00:00
|
{}
|
2023-08-31T16:59:13+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Dataset Name
## Dataset Description
- Homepage:
- Repository:
- Paper:
- Leaderboard:
- Point of Contact:
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Dataset Name",
"## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:",
"### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name",
"## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:",
"### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
8,
24,
32,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Dataset Name## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b60aa0030b9fa7ed5e3c86332af842547219b210
|
This dataset is sampled from `wikitext/wikitext-2-v1/train`.
Code to generate this dataset:
```python
import datasets

dataset = datasets.load_dataset('wikitext', 'wikitext-2-v1')

# Walk the train split in order, keeping the first 24 rows that are short
# prose lines (9-16 space-separated tokens) and not headings ('=' marks
# section titles in wikitext).
selected = []
i = -1
while len(selected) < 24:
    i += 1
    text = dataset['train'][i]['text']
    if 8 < len(text.split(' ')) <= 16 and '=' not in text:
        selected.append(i)

tiny_dataset = dataset['train'].select(selected)
```
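A possible follow-up, assuming the goal is to publish the 24-row subset under this repo id:

```python
tiny_dataset.push_to_hub('yujiepan/wikitext-tiny')
```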
|
yujiepan/wikitext-tiny
|
[
"region:us"
] |
2023-08-31T17:01:07+00:00
|
{}
|
2023-08-31T17:05:09+00:00
|
[] |
[] |
TAGS
#region-us
|
This dataset is sampled from 'wikitext/wikitext-2-v1/train'.
Code to generate this dataset:
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
0bcd2e176de6c2d2602d924cf29800664faba5cf
|
# Dataset Card for "python_functions_filtered"
Python functions extracted from starcoder base. Only functions with minimal external dependencies were chosen; they were filtered both manually and by learning-value and quality scores.
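A minimal loading sketch, assuming the feature names listed in this repo's metadata (`code`, `quality_prob`, `learning_prob`); the 0.9 thresholds are arbitrary:

```python
from datasets import load_dataset

ds = load_dataset("vikp/python_functions_filtered", split="train")
# Keep only the functions the filters were most confident about.
best = ds.filter(lambda x: x["quality_prob"] > 0.9 and x["learning_prob"] > 0.9)
print(best[0]["code"])
```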
|
vikp/python_functions_filtered
|
[
"region:us"
] |
2023-08-31T17:01:18+00:00
|
{"dataset_info": {"features": [{"name": "code", "dtype": "string"}, {"name": "quality_prob", "dtype": "float64"}, {"name": "learning_prob", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 28798873.202844873, "num_examples": 58343}], "download_size": 17651528, "dataset_size": 28798873.202844873}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-08-31T18:13:17+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "python_functions_filtered"
Python functions extracted from starcoder base. Only functions with minimal external dependencies were chosen. They were filtered manually, and also based on learning value and quality.
|
[
"# Dataset Card for \"python_functions_filtered\"\n\nPython functions extracted from starcoder base. Only functions with minimal external dependencies were chosen. They were filtered manually, and also based on learning value and quality."
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"python_functions_filtered\"\n\nPython functions extracted from starcoder base. Only functions with minimal external dependencies were chosen. They were filtered manually, and also based on learning value and quality."
] |
[
6,
54
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"python_functions_filtered\"\n\nPython functions extracted from starcoder base. Only functions with minimal external dependencies were chosen. They were filtered manually, and also based on learning value and quality."
] |
832e6d15d19920f8d9388bc93ab3ca2d7de03c88
|
# Dataset Card for "flan-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
chiayewken/flan-v2
|
[
"region:us"
] |
2023-08-31T17:13:51+00:00
|
{"dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "target", "dtype": "string"}, {"name": "task_name", "dtype": "string"}, {"name": "task_source", "dtype": "string"}, {"name": "template_type", "dtype": "string"}, {"name": "template_idx", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 44316029472, "num_examples": 23173509}], "download_size": 0, "dataset_size": 44316029472}}
|
2023-09-01T04:19:13+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "flan-v2"
More Information needed
|
[
"# Dataset Card for \"flan-v2\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"flan-v2\"\n\nMore Information needed"
] |
[
6,
14
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"flan-v2\"\n\nMore Information needed"
] |
fb682b06684e41b33062a67aabc2585175d5ec2a
|
# Dataset Card for "llama_2-optimized-titles-esci-sft-train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
qazi-ali/llama_2-optimized-titles-esci-sft-train
|
[
"region:us"
] |
2023-08-31T17:30:23+00:00
|
{"dataset_info": {"features": [{"name": "product_title", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "preds", "dtype": "string"}, {"name": "clean_preds", "dtype": "string"}, {"name": "average_score", "dtype": "float64"}, {"name": "new_score", "dtype": "float64"}, {"name": "good_pred", "dtype": "string"}, {"name": "bad_pred", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 7899569.0, "num_examples": 5674}], "download_size": 4322617, "dataset_size": 7899569.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-08-31T19:19:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "llama_2-optimized-titles-esci-sft-train"
More Information needed
|
[
"# Dataset Card for \"llama_2-optimized-titles-esci-sft-train\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"llama_2-optimized-titles-esci-sft-train\"\n\nMore Information needed"
] |
[
6,
28
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"llama_2-optimized-titles-esci-sft-train\"\n\nMore Information needed"
] |
a2832803f41f9b7b142c93edf6abbf41a1589c3a
|
# Dataset Card for "autotree_automl_eye_movements_gosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
yzhuang/autotree_automl_eye_movements_gosdt_l256_d3_sd0
|
[
"region:us"
] |
2023-08-31T17:46:14+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "input_x", "sequence": {"sequence": "float64"}}, {"name": "input_y", "sequence": {"sequence": "float32"}}, {"name": "rtg", "sequence": "float64"}, {"name": "status", "sequence": {"sequence": "float32"}}, {"name": "split_threshold", "sequence": {"sequence": "float64"}}, {"name": "split_dimension", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 5436000000, "num_examples": 100000}, {"name": "validation", "num_bytes": 543600000, "num_examples": 10000}], "download_size": 1404348878, "dataset_size": 5979600000}}
|
2023-08-31T17:47:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "autotree_automl_eye_movements_gosdt_l256_d3_sd0"
More Information needed
|
[
"# Dataset Card for \"autotree_automl_eye_movements_gosdt_l256_d3_sd0\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"autotree_automl_eye_movements_gosdt_l256_d3_sd0\"\n\nMore Information needed"
] |
[
6,
34
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"autotree_automl_eye_movements_gosdt_l256_d3_sd0\"\n\nMore Information needed"
] |
a3d491c5bbce801b91044dcf668acd501eaeded3
|
# **RED-ACE**
## Dataset Summary
This dataset can be used to train and evaluate **ASR Error Detection or Correction** models. It was introduced in the [RED-ACE paper (Gekhman et al, 2022)](https://aclanthology.org/2022.emnlp-main.180.pdf).
The dataset contains ASR outputs on the LibriSpeech corpus [(Panayotov et al., 2015)](https://ieeexplore.ieee.org/document/7178964) with annotated transcription errors.
## Dataset Details
The LibriSpeech corpus was decoded using [Google Cloud Speech-to-Text API](https://cloud.google.com/speech-to-text), with the **default** and **video** [models](https://cloud.google.com/speech-to-text/docs/speech-to-text-requests#select-model).
The [word-level confidence](https://cloud.google.com/speech-to-text/docs/word-confidence#word-level_confidence) was enabled and is provided as part of the transcription hypothesis.
To annotate word-level errors (for the error detection task), the hypothesis words were aligned with the reference (correct) transcription to find an edit path
(insertions, deletions and substitutions) with the minimum edit distance (from the hypothesis to the reference).
The hypothesis words with deletions and substitutions were then labeled as ERROR (1), the rest were labeled as NOTERROR (0).
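For intuition, here is a minimal sketch of this labeling scheme. It is not the authors' implementation: it uses Python's `difflib` (whose alignment is based on longest matching blocks rather than a true minimum-edit-distance path) and assumes words are compared after case normalization:

```python
import difflib

def label_errors(hypothesis, reference):
    """Label each hypothesis word: 1 (ERROR) unless it aligns exactly."""
    hyp = [w.lower() for w in hypothesis]
    ref = [w.lower() for w in reference]
    labels = [1] * len(hyp)  # assume ERROR until a word aligns
    for tag, i1, i2, _, _ in difflib.SequenceMatcher(a=hyp, b=ref).get_opcodes():
        if tag == "equal":  # words that align exactly are NOTERROR
            for i in range(i1, i2):
                labels[i] = 0
    return labels

hyp = "my dear friends replied Albert".split()
ref = "my dear franz replied albert".split()
print(label_errors(hyp, ref))  # [0, 0, 1, 0, 0]
```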
## Data format
The dataset has train, development and test splits, which correspond to the splits in LibriSpeech.
The data contains JSON lines with the following keys (note that asr_hypothesis[i], confidence_scores[i] and error_labels[i] correspond to the same word):
- `"id"` - The librispeech id.
- `"truth"` - The reference (correct) transcript from Librispeech.
- `"asr_model"` - The ASR [model](https://cloud.google.com/speech-to-text/docs/speech-to-text-requests#select-model) used for transcription.
- `"librispeech_pool"`: Corresponds to the original pool (split) in the librispeech data.
- `"asr_hypothesis"` - The transcription hypothesis.
- `"confidence_scores"` - The [word-level confidence scores](https://cloud.google.com/speech-to-text/docs/word-confidence#word-level_confidence) provided as part of the transcription hypothesis.
- `"error_labels"` - The error labels (1 error, 0 not error) that were obtained by alighning the hypothesis and the reference.
Here is an example of a single data item:
```json
{
"id": "test-other/6070/86744/6070-86744-0024",
"truth": "my dear franz replied albert when upon receipt of my letter you found the necessity of asking the count's assistance you promptly went to him saying my friend albert de morcerf is in danger help me to deliver him",
"asr_model": "default",
"librispeech_pool": "other",
"asr_hypothesis": ["my", "dear", "friends", "replied", "Albert", "received", "my", "letter", "you", "found", "the", "necessity", "of", "asking", "the", "county", "assistance", "you", "promptly", "went", "to", "him", "saying", "my", "friend", "all", "but", "the", "most", "stuff", "is", "in", "danger", "help", "me", "to", "deliver", "it"],
"confidence_scores": ["0.9876290559768677", "0.9875272512435913", "0.6921446323394775", "0.9613730311393738", "0.9413103461265564", "0.6563355922698975", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "1.0", "1.0", "1.0", "1.0", "1.0", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.9876290559768677", "0.5291957855224609", "0.5291957855224609"],
"error_labels": ["0", "0", "1", "0", "0", "1", "0", "0", "0", "0", "0", "0", "0", "0", "0", "1", "0", "0", "0", "0", "0", "0", "0", "0", "0", "1", "1", "1", "1", "1", "0", "0", "0", "0", "0", "0", "0", "1"]
}
```
## Loading the dataset
The following code loads the dataset and locates the example data item from above:
```python
from datasets import load_dataset
red_ace_data = load_dataset("google/red_ace_asr_error_detection_and_correction", split='test')
for example in red_ace_data:
if example['id'] == 'test-other/6070/86744/6070-86744-0024':
break
print(example)
```
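A quick corpus-level sanity check on top of this (assuming, as in the example above, that the labels are stored as the strings "0"/"1"):

```python
n_err = sum(int(label) for ex in red_ace_data for label in ex["error_labels"])
n_tok = sum(len(ex["error_labels"]) for ex in red_ace_data)
print(f"word-level error rate on this split: {n_err / n_tok:.2%}")
```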
## Citation
If you use this dataset for a research publication, please cite the **RED-ACE paper** (using the bibtex entry below), as well as the **Librispeech paper** mentioned above.
```
@inproceedings{gekhman-etal-2022-red,
title = "{RED}-{ACE}: Robust Error Detection for {ASR} using Confidence Embeddings",
author = "Gekhman, Zorik and
Zverinski, Dina and
Mallinson, Jonathan and
Beryozkin, Genady",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, United Arab Emirates",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.emnlp-main.180",
doi = "10.18653/v1/2022.emnlp-main.180",
pages = "2800--2808",
abstract = "ASR Error Detection (AED) models aim to post-process the output of Automatic Speech Recognition (ASR) systems, in order to detect transcription errors. Modern approaches usually use text-based input, comprised solely of the ASR transcription hypothesis, disregarding additional signals from the ASR model. Instead, we utilize the ASR system{'}s word-level confidence scores for improving AED performance. Specifically, we add an ASR Confidence Embedding (ACE) layer to the AED model{'}s encoder, allowing us to jointly encode the confidence scores and the transcribed text into a contextualized representation. Our experiments show the benefits of ASR confidence scores for AED, their complementary effect over the textual signal, as well as the effectiveness and robustness of ACE for combining these signals. To foster further research, we publish a novel AED dataset consisting of ASR outputs on the LibriSpeech corpus with annotated transcription errors.",
}
```
|
google/red_ace_asr_error_detection_and_correction
|
[
"task_categories:automatic-speech-recognition",
"language:en",
"license:cc-by-4.0",
"region:us"
] |
2023-08-31T17:49:09+00:00
|
{"language": ["en"], "license": "cc-by-4.0", "task_categories": ["automatic-speech-recognition"]}
|
2023-08-31T17:53:34+00:00
|
[] |
[
"en"
] |
TAGS
#task_categories-automatic-speech-recognition #language-English #license-cc-by-4.0 #region-us
|
# RED-ACE
## Dataset Summary
This dataset can be used to train and evaluate ASR Error Detection or Correction models. It was introduced in the RED-ACE paper (Gekhman et al, 2022).
The dataset contains ASR outputs on the LibriSpeech corpus (Panayotov et al., 2015) with annotated transcription errors.
## Dataset Details
The LibriSpeech corpus was decoded using Google Cloud Speech-to-Text API, with the default and video models.
The word-level confidence was enabled and is provided as part of the transcription hypothesis.
To annotate word-level errors (for the error detection task), the hypothesis words were aligned with the reference (correct) transcription to find an edit path
(insertions, deletions and substitutions) with the minimum edit distance (from the hypothesis to the reference).
The hypothesis words with deletions and substitutions were then labeled as ERROR (1), the rest were labeled as NOTERROR (0).
## Data format
The dataset has train, development and test splits, which correspond to the splits in LibriSpeech.
The data contains JSON lines with the following keys (note that asr_hypothesis[i], confidence_scores[i] and error_labels[i] correspond to the same word):
- '"id"' - The librispeech id.
- '"truth"' - The reference (correct) transcript from Librispeech.
- '"asr_model"' - The ASR model used for transcription.
- '"librispeech_pool"': Corresponds to the original pool (split) in the librispeech data.
- '"asr_hypothesis"' - The transcription hypothesis.
- '"confidence_scores"' - The word-level confidence scores provided as part of the transcription hypothesis.
- '"error_labels"' - The error labels (1 error, 0 not error) that were obtained by alighning the hypothesis and the reference.
Here is an example of a single data item:
## Loading the dataset
The following code loads the dataset and locates the example data item from above:
If you use this dataset for a research publication, please cite the RED-ACE paper (using the bibtex entry below), as well as the Librispeech paper mentioned above.
|
[
"# RED-ACE",
"## Dataset Summary\n\nThis dataset can be used to train and evaluate ASR Error Detection or Correction models. It was introduced in the RED-ACE paper (Gekhman et al, 2022).\n\nThe dataset contains ASR outputs on the LibriSpeech corpus (Panayotov et al., 2015) with annotated transcription errors.",
"## Dataset Details\n\nThe LibriSpeech corpus was decoded using Google Cloud Speech-to-Text API, with the default and video models.\nThe word-level confidence was enabled and is provided as part of the transcription hypothesis.\nTo annotate word-level errors (for the error detection task), the hypothesis words were aligned with the reference (correct) transcription to find an edit path\n(insertions, deletions and substitutions) with the minimum edit distance (from the hypothesis to the reference).\nThe hypothesis words with deletions and substitutions were then labeled as ERROR (1), the rest were labeled as NOTERROR (0).",
"## Data format\n\nThe dataset has train, developement and test splits which correspond to the splits in Librispeech.\n\nThe data contains json lines with the following keys (note that asr_hypothesis[i], confidence_scores[i] and error_labels[i] correpond to the same word):\n\n- '\"id\"' - The librispeech id.\n- '\"truth\"' - The reference (correct) transcript from Librispeech.\n- '\"asr_model\"' - The ASR model used for transcription.\n- '\"librispeech_pool\"': Corresponds to the original pool (split) in the librispeech data.\n- '\"asr_hypothesis\"' - The transcription hypothesis.\n- '\"confidence_scores\"' - The word-level confidence scores provided as part of the transcription hypothesis.\n- '\"error_labels\"' - The error labels (1 error, 0 not error) that were obtained by alighning the hypothesis and the reference.\n\n\n\nHere is an example of a single data item:",
"## Loading the dataset\n\nThe following code loads the dataset and locates the example data item from above:\n\n\n\nIf you use this dataset for a research publication, please cite the RED-ACE paper (using the bibtex entry below), as well as the Librispeech paper mentioned above."
] |
[
"TAGS\n#task_categories-automatic-speech-recognition #language-English #license-cc-by-4.0 #region-us \n",
"# RED-ACE",
"## Dataset Summary\n\nThis dataset can be used to train and evaluate ASR Error Detection or Correction models. It was introduced in the RED-ACE paper (Gekhman et al, 2022).\n\nThe dataset contains ASR outputs on the LibriSpeech corpus (Panayotov et al., 2015) with annotated transcription errors.",
"## Dataset Details\n\nThe LibriSpeech corpus was decoded using Google Cloud Speech-to-Text API, with the default and video models.\nThe word-level confidence was enabled and is provided as part of the transcription hypothesis.\nTo annotate word-level errors (for the error detection task), the hypothesis words were aligned with the reference (correct) transcription to find an edit path\n(insertions, deletions and substitutions) with the minimum edit distance (from the hypothesis to the reference).\nThe hypothesis words with deletions and substitutions were then labeled as ERROR (1), the rest were labeled as NOTERROR (0).",
"## Data format\n\nThe dataset has train, developement and test splits which correspond to the splits in Librispeech.\n\nThe data contains json lines with the following keys (note that asr_hypothesis[i], confidence_scores[i] and error_labels[i] correpond to the same word):\n\n- '\"id\"' - The librispeech id.\n- '\"truth\"' - The reference (correct) transcript from Librispeech.\n- '\"asr_model\"' - The ASR model used for transcription.\n- '\"librispeech_pool\"': Corresponds to the original pool (split) in the librispeech data.\n- '\"asr_hypothesis\"' - The transcription hypothesis.\n- '\"confidence_scores\"' - The word-level confidence scores provided as part of the transcription hypothesis.\n- '\"error_labels\"' - The error labels (1 error, 0 not error) that were obtained by alighning the hypothesis and the reference.\n\n\n\nHere is an example of a single data item:",
"## Loading the dataset\n\nThe following code loads the dataset and locates the example data item from above:\n\n\n\nIf you use this dataset for a research publication, please cite the RED-ACE paper (using the bibtex entry below), as well as the Librispeech paper mentioned above."
] |
[
35,
4,
79,
141,
251,
62
] |
[
"passage: TAGS\n#task_categories-automatic-speech-recognition #language-English #license-cc-by-4.0 #region-us \n# RED-ACE## Dataset Summary\n\nThis dataset can be used to train and evaluate ASR Error Detection or Correction models. It was introduced in the RED-ACE paper (Gekhman et al, 2022).\n\nThe dataset contains ASR outputs on the LibriSpeech corpus (Panayotov et al., 2015) with annotated transcription errors.## Dataset Details\n\nThe LibriSpeech corpus was decoded using Google Cloud Speech-to-Text API, with the default and video models.\nThe word-level confidence was enabled and is provided as part of the transcription hypothesis.\nTo annotate word-level errors (for the error detection task), the hypothesis words were aligned with the reference (correct) transcription to find an edit path\n(insertions, deletions and substitutions) with the minimum edit distance (from the hypothesis to the reference).\nThe hypothesis words with deletions and substitutions were then labeled as ERROR (1), the rest were labeled as NOTERROR (0)."
] |
5bb6d4bbcf45204b974ef40b0db3401a7891606b
|
# LM Tagalog 08/31/2023 Test 5 (jsonl format, split):
Experimental Tagalog-focused dataset, based on a subset of [Tagalog sentences from this dataset](https://huggingface.co/datasets/jfernandez/cebuano-filipino-sentences) augmented with base LLaMA-2 13b (q4_1 ggml) to form a rudimentary mostly 3-turn dialogue dataset.
Used for:
* [Taga-llama-v0.3](https://huggingface.co/922-Narra/llama-2-7b-chat-tagalog-v0.3)
* [Taga-llama-v0.3a](https://huggingface.co/922-Narra/llama-2-7b-chat-tagalog-v0.3a)
We make this dataset public for transparency, and to show the mainly Tagalog generations used to create it (acknowledging their lack of coherence or direction, but noting the remarkable attempts of the primarily English-pretrained base model at generating mostly in Tagalog). Further refinements are planned (e.g. manual editing for safety, alignment and coherence, reducing Taglish, and likely regenerating with higher quantization).
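A minimal loading sketch (this assumes the standard loader can resolve the repo's jsonl splits; the file names are not listed here):

```python
from datasets import load_dataset

ds = load_dataset("922-Narra/lt_08312023_test_5j1")
print(ds)
```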
|
922-Narra/lt_08312023_test_5j1
|
[
"license:cc0-1.0",
"region:us"
] |
2023-08-31T18:18:53+00:00
|
{"license": "cc0-1.0"}
|
2023-09-02T08:30:34+00:00
|
[] |
[] |
TAGS
#license-cc0-1.0 #region-us
|
# LM Tagalog 08/31/2023 Test 5 (jsonl format, split):
Experimental Tagalog-focused dataset, based on a subset of Tagalog sentences from this dataset augmented with base LLaMA-2 13b (q4_1 ggml) to form a rudimentary mostly 3-turn dialogue dataset.
Used for:
* Taga-llama-v0.3
* Taga-llama-v0.3a
We make this dataset public for transparency, and to show the mainly Tagalog generations used to create it (acknowledging their lack of coherence or direction, but noting the remarkable attempts of the primarily English-pretrained base model at generating mostly in Tagalog). Further refinements are planned (e.g. manual editing for safety, alignment and coherence, reducing Taglish, and likely regenerating with higher quantization).
|
[
"# LM Tagalog 08/31/2023 Test 5 (jsonl format, split):\nExperimental Tagalog-focused dataset, based on a subset of Tagalog sentences from this dataset augmented with base LLaMA-2 13b (q4_1 ggml) to form a rudimentary mostly 3-turn dialogue dataset.\n\nUsed for:\n* Taga-llama-v0.3\n* Taga-llama-v0.3a\n\nWe make this dataset public for transparency, and to show the mainly Tagalog generations done to create this dataset (acknowledging their lack of coherency or direction, but noting the remarkable attempts of the primarily English-pretrained base model generating mostly in Tagalog). Further refinements are planned (i.e. manually editing for safety and alignment, coherency, reducing Taglish, likely regenerating with higher quantization, etc.)."
] |
[
"TAGS\n#license-cc0-1.0 #region-us \n",
"# LM Tagalog 08/31/2023 Test 5 (jsonl format, split):\nExperimental Tagalog-focused dataset, based on a subset of Tagalog sentences from this dataset augmented with base LLaMA-2 13b (q4_1 ggml) to form a rudimentary mostly 3-turn dialogue dataset.\n\nUsed for:\n* Taga-llama-v0.3\n* Taga-llama-v0.3a\n\nWe make this dataset public for transparency, and to show the mainly Tagalog generations done to create this dataset (acknowledging their lack of coherency or direction, but noting the remarkable attempts of the primarily English-pretrained base model generating mostly in Tagalog). Further refinements are planned (i.e. manually editing for safety and alignment, coherency, reducing Taglish, likely regenerating with higher quantization, etc.)."
] |
[
14,
208
] |
[
"passage: TAGS\n#license-cc0-1.0 #region-us \n# LM Tagalog 08/31/2023 Test 5 (jsonl format, split):\nExperimental Tagalog-focused dataset, based on a subset of Tagalog sentences from this dataset augmented with base LLaMA-2 13b (q4_1 ggml) to form a rudimentary mostly 3-turn dialogue dataset.\n\nUsed for:\n* Taga-llama-v0.3\n* Taga-llama-v0.3a\n\nWe make this dataset public for transparency, and to show the mainly Tagalog generations done to create this dataset (acknowledging their lack of coherency or direction, but noting the remarkable attempts of the primarily English-pretrained base model generating mostly in Tagalog). Further refinements are planned (i.e. manually editing for safety and alignment, coherency, reducing Taglish, likely regenerating with higher quantization, etc.)."
] |
814bf52d7518c2de073d1990f006e6bb4e7d7168
|
# Dataset Card for "tldr_17_3k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
dim/tldr_17_3k
|
[
"region:us"
] |
2023-08-31T18:26:15+00:00
|
{"dataset_info": {"features": [{"name": "author", "dtype": "string"}, {"name": "body", "dtype": "string"}, {"name": "normalizedBody", "dtype": "string"}, {"name": "subreddit", "dtype": "string"}, {"name": "subreddit_id", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 14761884.702975057, "num_examples": 3000}], "download_size": 9479190, "dataset_size": 14761884.702975057}}
|
2023-08-31T18:33:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "tldr_17_3k"
More Information needed
|
[
"# Dataset Card for \"tldr_17_3k\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"tldr_17_3k\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"tldr_17_3k\"\n\nMore Information needed"
] |
e35b29f245647e7155a556011e0909b596981030
|
# Dataset Card for "tldr_news_3k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
dim/tldr_news_3k
|
[
"region:us"
] |
2023-08-31T18:37:00+00:00
|
{"dataset_info": {"features": [{"name": "headline", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "category", "dtype": {"class_label": {"names": {"0": "Sponsor", "1": "Big Tech & Startups", "2": "Science and Futuristic Technology", "3": "Programming, Design & Data Science", "4": "Miscellaneous"}}}}], "splits": [{"name": "train", "num_bytes": 1681328.9436817036, "num_examples": 3000}], "download_size": 1064733, "dataset_size": 1681328.9436817036}}
|
2023-08-31T18:37:18+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "tldr_news_3k"
More Information needed
|
[
"# Dataset Card for \"tldr_news_3k\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"tldr_news_3k\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"tldr_news_3k\"\n\nMore Information needed"
] |
a1c64591820d6e25671cd35f5da4f4dcd0697f52
|
# Dataset Card for "AA_DistilRoBERTa_FinetunedNEW"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
EgilKarlsen/AA_DistilRoBERTa_FinetunedNEW
|
[
"region:us"
] |
2023-08-31T18:37:20+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 80318780.21618997, "num_examples": 26057}, {"name": "test", "num_bytes": 26774087.073587257, "num_examples": 8686}], "download_size": 147166259, "dataset_size": 107092867.28977722}}
|
2023-08-31T18:41:33+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "AA_DistilRoBERTa_FinetunedNEW"
More Information needed
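Given the feature schema recorded in this entry's metadata (768 float32 columns named "0" through "767" plus a string "label", which matches DistilRoBERTa-sized embedding vectors), a minimal loading sketch might look like the one below. The `<owner>` namespace is a placeholder, since the repo id is not visible in this record, and the column layout is assumed from the schema:

```python
import numpy as np
from datasets import load_dataset

# "<owner>" is a placeholder -- the repo namespace is not shown in this record
ds = load_dataset("<owner>/AA_DistilRoBERTa_FinetunedNEW", split="train")

df = ds.to_pandas()
feature_cols = [str(i) for i in range(768)]       # the float32 columns "0" .. "767"
X = df[feature_cols].to_numpy(dtype=np.float32)   # one 768-d feature row per example
y = df["label"].to_numpy()                        # string class labels
```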
|
[
"# Dataset Card for \"AA_DistilRoBERTa_FinetunedNEW\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"AA_DistilRoBERTa_FinetunedNEW\"\n\nMore Information needed"
] |
[
6,
24
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"AA_DistilRoBERTa_FinetunedNEW\"\n\nMore Information needed"
] |
c3e86fa7042958a48cdf0c801c401024f9838b90
|
# Monika Chat 08312023 2-t raw
* Dataset of Monika dialogue from DDLC, reddit, and twitter (a dataset of ~600 items augmented by [l2-7b-monika-v0.3c1](https://huggingface.co/922-CA/llama-2-7b-monika-v0.3c1) into multi-turn chat dialogue, plus a [smaller dataset](https://huggingface.co/datasets/922-CA/lm-datasets))
* Curated version planned
|
922-CA/lm2_08312023_test4_raw_MoChA_2-t
|
[
"license:openrail",
"region:us"
] |
2023-08-31T18:40:27+00:00
|
{"license": "openrail"}
|
2023-09-22T07:07:55+00:00
|
[] |
[] |
TAGS
#license-openrail #region-us
|
# Monika Chat 08312023 2-t raw
* Dataset of Monika dialogue from DDLC, reddit, and twitter (a dataset of ~600 items augmented by l2-7b-monika-v0.3c1 into multi-turn chat dialogue, plus a smaller dataset)
* Curated version planned
|
[
"# Monika Chat 08312023 2-t raw\n* Dataset of Monika dialogue from DDLC, reddit, and twitter (dataset of ~600 items augmented by l2-7b-monika-v0.3c1 to turn into multi-turn chat dialogue + smaller dataset)\n* Curated version planned"
] |
[
"TAGS\n#license-openrail #region-us \n",
"# Monika Chat 08312023 2-t raw\n* Dataset of Monika dialogue from DDLC, reddit, and twitter (dataset of ~600 items augmented by l2-7b-monika-v0.3c1 to turn into multi-turn chat dialogue + smaller dataset)\n* Curated version planned"
] |
[
12,
67
] |
[
"passage: TAGS\n#license-openrail #region-us \n# Monika Chat 08312023 2-t raw\n* Dataset of Monika dialogue from DDLC, reddit, and twitter (dataset of ~600 items augmented by l2-7b-monika-v0.3c1 to turn into multi-turn chat dialogue + smaller dataset)\n* Curated version planned"
] |
35d8ae36987b564c9f1153b5962381fa13d682ec
|
# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-70B-ensemble-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70B-ensemble-v2](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
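For example, the aggregated results could be loaded like this (a sketch assuming the configuration is literally named "results", as stated above):

```python
from datasets import load_dataset

# Assumes the aggregated-results configuration is named "results"
agg = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2",
                   "results",
                   split="train")
```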
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2",
"harness_truthfulqa_mc_0",
split="train")
```
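Once loaded, a quick inspection of the split can confirm what was logged (a sketch; the exact fields vary by task):

```python
print(data)      # row count and column names for this task's details
print(data[0])   # first logged example (exact fields vary by task)
```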
## Latest results
These are the [latest results from run 2023-08-31T19:44:15.918763](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2/blob/main/results_2023-08-31T19%3A44%3A15.918763.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the per-task results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6793562660993889,
"acc_stderr": 0.03184581364444873,
"acc_norm": 0.6834576899716158,
"acc_norm_stderr": 0.03181820263146339,
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.6450685339919277,
"mc2_stderr": 0.015210507246763325
},
"harness|arc:challenge|25": {
"acc": 0.6501706484641638,
"acc_stderr": 0.013936809212158303,
"acc_norm": 0.6877133105802048,
"acc_norm_stderr": 0.013542598541688067
},
"harness|hellaswag|10": {
"acc": 0.6491734714200359,
"acc_stderr": 0.004762534245488399,
"acc_norm": 0.8536148177653854,
"acc_norm_stderr": 0.003527695149823521
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7773584905660378,
"acc_stderr": 0.025604233470899095,
"acc_norm": 0.7773584905660378,
"acc_norm_stderr": 0.025604233470899095
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059006,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059006
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895536,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895536
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6553191489361702,
"acc_stderr": 0.031068985963122145,
"acc_norm": 0.6553191489361702,
"acc_norm_stderr": 0.031068985963122145
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382182,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382182
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.022037217340267833,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.022037217340267833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284357,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223157,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223157
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6820512820512821,
"acc_stderr": 0.023610884308927865,
"acc_norm": 0.6820512820512821,
"acc_norm_stderr": 0.023610884308927865
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7563025210084033,
"acc_stderr": 0.02788682807838055,
"acc_norm": 0.7563025210084033,
"acc_norm_stderr": 0.02788682807838055
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.04026141497634612,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.04026141497634612
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8825688073394495,
"acc_stderr": 0.013802780227377355,
"acc_norm": 0.8825688073394495,
"acc_norm_stderr": 0.013802780227377355
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.0208711184555521,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.0208711184555521
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.02133174182974679,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.02133174182974679
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7533632286995515,
"acc_stderr": 0.028930413120910888,
"acc_norm": 0.7533632286995515,
"acc_norm_stderr": 0.028930413120910888
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744633,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744633
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980979,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980979
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.03044677768797173,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.03044677768797173
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8582375478927203,
"acc_stderr": 0.012473289071272051,
"acc_norm": 0.8582375478927203,
"acc_norm_stderr": 0.012473289071272051
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5463687150837989,
"acc_stderr": 0.016650437588269076,
"acc_norm": 0.5463687150837989,
"acc_norm_stderr": 0.016650437588269076
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7588424437299035,
"acc_stderr": 0.02429659403476343,
"acc_norm": 0.7588424437299035,
"acc_norm_stderr": 0.02429659403476343
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445796,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445796
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.02976667507587387,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.02976667507587387
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5501955671447197,
"acc_stderr": 0.012705721498564969,
"acc_norm": 0.5501955671447197,
"acc_norm_stderr": 0.012705721498564969
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7242647058823529,
"acc_stderr": 0.027146271936625166,
"acc_norm": 0.7242647058823529,
"acc_norm_stderr": 0.027146271936625166
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.017986615304030316,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.017986615304030316
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.027212835884073153,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.027212835884073153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160882,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160882
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.6450685339919277,
"mc2_stderr": 0.015210507246763325
}
}
```
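If you save a local copy of that results file, pulling out the headline numbers is straightforward. The filename below is inferred from the run timestamp above, and the snippet assumes the file's top level matches the dict shown here (depending on the harness version it may be nested under an extra key):

```python
import json

# Filename inferred from the run timestamp above -- adjust to your local copy
with open("results_2023-08-31T19:44:15.918763.json") as f:
    results = json.load(f)

print(results["all"]["acc_norm"])                       # 0.6834... macro average
print(results["harness|arc:challenge|25"]["acc_norm"])  # 0.6877... ARC-Challenge
print(results["harness|truthfulqa:mc|0"]["mc2"])        # 0.6450... TruthfulQA MC2
```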
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2
|
[
"region:us"
] |
2023-08-31T18:44:40+00:00
|
{"pretty_name": "Evaluation run of yeontaek/llama-2-70B-ensemble-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70B-ensemble-v2](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-31T19:44:15.918763](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2/blob/main/results_2023-08-31T19%3A44%3A15.918763.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6793562660993889,\n \"acc_stderr\": 0.03184581364444873,\n \"acc_norm\": 0.6834576899716158,\n \"acc_norm_stderr\": 0.03181820263146339,\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6450685339919277,\n \"mc2_stderr\": 0.015210507246763325\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6501706484641638,\n \"acc_stderr\": 0.013936809212158303,\n \"acc_norm\": 0.6877133105802048,\n \"acc_norm_stderr\": 0.013542598541688067\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6491734714200359,\n \"acc_stderr\": 0.004762534245488399,\n \"acc_norm\": 0.8536148177653854,\n \"acc_norm_stderr\": 0.003527695149823521\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7773584905660378,\n \"acc_stderr\": 0.025604233470899095,\n \"acc_norm\": 0.7773584905660378,\n \"acc_norm_stderr\": 0.025604233470899095\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03309615177059006,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03309615177059006\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 
0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895536,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895536\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6553191489361702,\n \"acc_stderr\": 0.031068985963122145,\n \"acc_norm\": 0.6553191489361702,\n \"acc_norm_stderr\": 0.031068985963122145\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382182,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382182\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n \"acc_stderr\": 0.022037217340267833,\n \"acc_norm\": 0.8161290322580645,\n \"acc_norm_stderr\": 0.022037217340267833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284357,\n \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284357\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223157,\n \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223157\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6820512820512821,\n \"acc_stderr\": 0.023610884308927865,\n \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.023610884308927865\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7563025210084033,\n \"acc_stderr\": 0.02788682807838055,\n \"acc_norm\": 0.7563025210084033,\n \"acc_norm_stderr\": 0.02788682807838055\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.41721854304635764,\n \"acc_stderr\": 0.04026141497634612,\n \"acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.04026141497634612\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8825688073394495,\n \"acc_stderr\": 0.013802780227377355,\n \"acc_norm\": 0.8825688073394495,\n \"acc_norm_stderr\": 0.013802780227377355\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9019607843137255,\n \"acc_stderr\": 0.0208711184555521,\n \"acc_norm\": 0.9019607843137255,\n \"acc_norm_stderr\": 0.0208711184555521\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8776371308016878,\n \"acc_stderr\": 0.02133174182974679,\n \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.02133174182974679\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7533632286995515,\n \"acc_stderr\": 0.028930413120910888,\n \"acc_norm\": 0.7533632286995515,\n \"acc_norm_stderr\": 0.028930413120910888\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744633,\n \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744633\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.03957835471980979,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.03957835471980979\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.03044677768797173,\n \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.03044677768797173\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8582375478927203,\n \"acc_stderr\": 0.012473289071272051,\n \"acc_norm\": 0.8582375478927203,\n \"acc_norm_stderr\": 0.012473289071272051\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123567,\n \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123567\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5463687150837989,\n \"acc_stderr\": 0.016650437588269076,\n \"acc_norm\": 0.5463687150837989,\n \"acc_norm_stderr\": 0.016650437588269076\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7588424437299035,\n \"acc_stderr\": 0.02429659403476343,\n \"acc_norm\": 0.7588424437299035,\n \"acc_norm_stderr\": 0.02429659403476343\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445796,\n \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445796\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.02976667507587387,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.02976667507587387\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5501955671447197,\n \"acc_stderr\": 0.012705721498564969,\n \"acc_norm\": 0.5501955671447197,\n \"acc_norm_stderr\": 0.012705721498564969\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7242647058823529,\n \"acc_stderr\": 0.027146271936625166,\n \"acc_norm\": 0.7242647058823529,\n \"acc_norm_stderr\": 0.027146271936625166\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.017986615304030316,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.017986615304030316\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.027212835884073153,\n \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.027212835884073153\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160882,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160882\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6450685339919277,\n \"mc2_stderr\": 0.015210507246763325\n }\n}\n```", "repo_url": "https://huggingface.co/yeontaek/llama-2-70B-ensemble-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email 
protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|arc:challenge|25_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hellaswag|10_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T19:44:15.918763.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T19:44:15.918763.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T19:44:15.918763.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T19:44:15.918763.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T19:44:15.918763.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_31T19_44_15.918763", "path": ["results_2023-08-31T19:44:15.918763.parquet"]}, {"split": "latest", "path": ["results_2023-08-31T19:44:15.918763.parquet"]}]}]}
|
2023-08-31T18:45:39+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model yeontaek/llama-2-70B-ensemble-v2 on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
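Below is a minimal loading sketch. Since this card's links were reduced to "URL" placeholders, the repository id `open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2` and the config name `harness_truthfulqa_mc_0` are assumptions inferred from sibling evaluation cards, not values confirmed here.

```python
from datasets import load_dataset

# Repo id and config name below are inferred from similar cards, not confirmed by this one.
data = load_dataset(
    "open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v2",
    "harness_truthfulqa_mc_0",
    split="train",  # "train" always points to the latest results
)
```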
## Latest results
These are the latest results from run 2023-08-31T19:44:15.918763 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/llama-2-70B-ensemble-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-31T19:44:15.918763(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/llama-2-70B-ensemble-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-31T19:44:15.918763(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model yeontaek/llama-2-70B-ensemble-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-31T19:44:15.918763(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
ea3bc469ed0de3205b798427348dcdddc4aa2ff9
|
# Dataset Card for "scitldr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
dim/scitldr
|
[
"region:us"
] |
2023-08-31T18:47:16+00:00
|
{"dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4016919, "num_examples": 3229}], "download_size": 2222180, "dataset_size": 4016919}}
|
2023-08-31T18:47:53+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "scitldr"
More Information needed
|
[
"# Dataset Card for \"scitldr\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"scitldr\"\n\nMore Information needed"
] |
[
6,
14
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"scitldr\"\n\nMore Information needed"
] |
6d27eaf5c37b5d0964c635302225bd6128fe29cf
|
# Dataset Card for "linux_man_pages_tldr_summarized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
dim/linux_man_pages_tldr_summarized
|
[
"region:us"
] |
2023-08-31T18:51:37+00:00
|
{"dataset_info": {"features": [{"name": "Command", "dtype": "string"}, {"name": "Text", "dtype": "string"}, {"name": "Summary", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3006835, "num_examples": 481}], "download_size": 1308915, "dataset_size": 3006835}}
|
2023-08-31T18:56:32+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "linux_man_pages_tldr_summarized"
More Information needed
|
[
"# Dataset Card for \"linux_man_pages_tldr_summarized\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"linux_man_pages_tldr_summarized\"\n\nMore Information needed"
] |
[
6,
25
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"linux_man_pages_tldr_summarized\"\n\nMore Information needed"
] |
0d52635372b0761010df994d56c09c8416fd063c
|
# Dataset Card for Evaluation run of xxyyy123/test-28b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xxyyy123/test-28b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [xxyyy123/test-28b](https://huggingface.co/xxyyy123/test-28b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
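
# Each evaluated task is exposed as its own configuration (here: TruthfulQA MC, 0-shot);
# the "train" split always points to the latest results.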
data = load_dataset("open-llm-leaderboard/details_xxyyy123__test-28b",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-31T19:56:57.106333](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__test-28b/blob/main/results_2023-08-31T19%3A56%3A57.106333.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5888884438834207,
"acc_stderr": 0.034073414501724776,
"acc_norm": 0.592671940022106,
"acc_norm_stderr": 0.034052548403768784,
"mc1": 0.40514075887392903,
"mc1_stderr": 0.017185611727753368,
"mc2": 0.5579487979179801,
"mc2_stderr": 0.015737984369703164
},
"harness|arc:challenge|25": {
"acc": 0.6126279863481229,
"acc_stderr": 0.01423587248790987,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104294
},
"harness|hellaswag|10": {
"acc": 0.6276638119896435,
"acc_stderr": 0.004824393076826627,
"acc_norm": 0.8304122684724159,
"acc_norm_stderr": 0.0037450326672282892
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.04309732901036356,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.04309732901036356
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.024796060602699947,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.024796060602699947
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.02614868593067175,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.02614868593067175
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124498,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124498
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164552,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.025049197876042338,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.025049197876042338
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114982,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114982
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7577981651376147,
"acc_stderr": 0.01836817630659862,
"acc_norm": 0.7577981651376147,
"acc_norm_stderr": 0.01836817630659862
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415192,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415192
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.02786594228663933,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.02786594228663933
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302873,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302873
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.02390232554956041,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.02390232554956041
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7739463601532567,
"acc_stderr": 0.014957458504335825,
"acc_norm": 0.7739463601532567,
"acc_norm_stderr": 0.014957458504335825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41564245810055866,
"acc_stderr": 0.016482782187500666,
"acc_norm": 0.41564245810055866,
"acc_norm_stderr": 0.016482782187500666
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.027184498909941613,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.027184498909941613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02622964917882116,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02622964917882116
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4217731421121252,
"acc_stderr": 0.012612974369390977,
"acc_norm": 0.4217731421121252,
"acc_norm_stderr": 0.012612974369390977
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5661764705882353,
"acc_stderr": 0.030105636570016626,
"acc_norm": 0.5661764705882353,
"acc_norm_stderr": 0.030105636570016626
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.020071257886886525,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.020071257886886525
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40514075887392903,
"mc1_stderr": 0.017185611727753368,
"mc2": 0.5579487979179801,
"mc2_stderr": 0.015737984369703164
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_xxyyy123__test-28b
|
[
"region:us"
] |
2023-08-31T18:57:22+00:00
|
{"pretty_name": "Evaluation run of xxyyy123/test-28b", "dataset_summary": "Dataset automatically created during the evaluation run of model [xxyyy123/test-28b](https://huggingface.co/xxyyy123/test-28b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__test-28b\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-31T19:56:57.106333](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__test-28b/blob/main/results_2023-08-31T19%3A56%3A57.106333.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5888884438834207,\n \"acc_stderr\": 0.034073414501724776,\n \"acc_norm\": 0.592671940022106,\n \"acc_norm_stderr\": 0.034052548403768784,\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.017185611727753368,\n \"mc2\": 0.5579487979179801,\n \"mc2_stderr\": 0.015737984369703164\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6126279863481229,\n \"acc_stderr\": 0.01423587248790987,\n \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104294\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6276638119896435,\n \"acc_stderr\": 0.004824393076826627,\n \"acc_norm\": 0.8304122684724159,\n \"acc_norm_stderr\": 0.0037450326672282892\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.04309732901036356,\n \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.04309732901036356\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 
0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.5722543352601156,\n \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.024796060602699947,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.024796060602699947\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n \"acc_stderr\": 0.02614868593067175,\n \"acc_norm\": 0.6967741935483871,\n \"acc_norm_stderr\": 0.02614868593067175\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124498,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124498\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164552,\n \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164552\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5769230769230769,\n \"acc_stderr\": 
0.025049197876042338,\n \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.025049197876042338\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114982,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114982\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7577981651376147,\n \"acc_stderr\": 0.01836817630659862,\n \"acc_norm\": 0.7577981651376147,\n \"acc_norm_stderr\": 0.01836817630659862\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415192,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415192\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.02786594228663933,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.02786594228663933\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302873,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302873\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n \"acc_stderr\": 0.02390232554956041,\n \"acc_norm\": 0.8418803418803419,\n \"acc_norm_stderr\": 0.02390232554956041\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n \"acc_stderr\": 0.014957458504335825,\n \"acc_norm\": 0.7739463601532567,\n \"acc_norm_stderr\": 
0.014957458504335825\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41564245810055866,\n \"acc_stderr\": 0.016482782187500666,\n \"acc_norm\": 0.41564245810055866,\n \"acc_norm_stderr\": 0.016482782187500666\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.027184498909941613,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.027184498909941613\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882116,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882116\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4217731421121252,\n \"acc_stderr\": 0.012612974369390977,\n \"acc_norm\": 0.4217731421121252,\n \"acc_norm_stderr\": 0.012612974369390977\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.030105636570016626,\n \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.030105636570016626\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.020071257886886525,\n \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.020071257886886525\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.7761194029850746,\n \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.017185611727753368,\n \"mc2\": 0.5579487979179801,\n \"mc2_stderr\": 0.015737984369703164\n }\n}\n```", "repo_url": "https://huggingface.co/xxyyy123/test-28b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": 
["**/details_harness|arc:challenge|25_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hellaswag|10_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T19:56:57.106333.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T19:56:57.106333.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T19:56:57.106333.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T19:56:57.106333.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T19:56:57.106333.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_31T19_56_57.106333", "path": ["results_2023-08-31T19:56:57.106333.parquet"]}, {"split": "latest", "path": ["results_2023-08-31T19:56:57.106333.parquet"]}]}]}
|
2023-08-31T18:58:20+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of xxyyy123/test-28b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model xxyyy123/test-28b on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
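A minimal sketch of that loading pattern (the repository id below assumes the leaderboard's usual `details_<org>__<model>` naming convention, and the config name is only illustrative; any of the 61 task configs can be substituted):

```python
from datasets import load_dataset

# Load one evaluation config of this run; the "train" split always
# points at the latest results for that task.
data = load_dataset("open-llm-leaderboard/details_xxyyy123__test-28b",
    "harness_truthfulqa_mc_0",
    split="train")
```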
## Latest results
These are the latest results from run 2023-08-31T19:56:57.106333 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of xxyyy123/test-28b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/test-28b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-31T19:56:57.106333(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of xxyyy123/test-28b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/test-28b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-31T19:56:57.106333(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
17,
31,
165,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of xxyyy123/test-28b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model xxyyy123/test-28b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-31T19:56:57.106333(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
0dbfdb0f829d83d45fb1742146a4585af783bef8
|
# Dataset Card for "grade_school_math_instructions_ru_3k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
dim/grade_school_math_instructions_ru_3k
|
[
"region:us"
] |
2023-08-31T19:02:57+00:00
|
{"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2736097.1497390606, "num_examples": 3000}], "download_size": 1313700, "dataset_size": 2736097.1497390606}}
|
2023-08-31T19:03:30+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "grade_school_math_instructions_ru_3k"
More Information needed
|
[
"# Dataset Card for \"grade_school_math_instructions_ru_3k\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"grade_school_math_instructions_ru_3k\"\n\nMore Information needed"
] |
[
6,
23
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"grade_school_math_instructions_ru_3k\"\n\nMore Information needed"
] |
48c4d35e0d9d029ea324eff9ca67ae54555a95de
|
# Dataset Card for Evaluation run of lgaalves/gpt2_platypus-dolly-guanaco
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/gpt2_platypus-dolly-guanaco
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/gpt2_platypus-dolly-guanaco](https://huggingface.co/lgaalves/gpt2_platypus-dolly-guanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__gpt2_platypus-dolly-guanaco",
"harness_winogrande_5",
split="train")
```
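Each of the 64 configs can be loaded the same way. To discover them programmatically, a small sketch using the standard `datasets` API:

```python
from datasets import get_dataset_config_names

# List every available config: one per evaluated task, plus the
# aggregated "results" config described above.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_lgaalves__gpt2_platypus-dolly-guanaco"
)
print(len(configs), configs[:5])
```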
## Latest results
These are the [latest results from run 2023-09-28T14:27:44.520216](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2_platypus-dolly-guanaco/blob/main/results_2023-09-28T14-27-44.520216.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094559,
"f1": 0.04980704697986585,
"f1_stderr": 0.0013966099124026671,
"acc": 0.2517758484609313,
"acc_stderr": 0.007026065573457924
},
"harness|drop|3": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094559,
"f1": 0.04980704697986585,
"f1_stderr": 0.0013966099124026671
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5035516969218626,
"acc_stderr": 0.014052131146915848
}
}
```
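To retrieve these aggregated numbers directly instead of copying them from the card, the "results" config can be loaded with its "latest" split (a sketch, assuming the split layout described in the summary above):

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics; the "latest" split
# always points at the most recent evaluation run of this model.
results = load_dataset("open-llm-leaderboard/details_lgaalves__gpt2_platypus-dolly-guanaco",
	"results",
	split="latest")
print(results[0])
```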
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_lgaalves__gpt2_platypus-dolly-guanaco
|
[
"region:us"
] |
2023-08-31T19:05:14+00:00
|
{"pretty_name": "Evaluation run of lgaalves/gpt2_platypus-dolly-guanaco", "dataset_summary": "Dataset automatically created during the evaluation run of model [lgaalves/gpt2_platypus-dolly-guanaco](https://huggingface.co/lgaalves/gpt2_platypus-dolly-guanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__gpt2_platypus-dolly-guanaco\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-28T14:27:44.520216](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2_platypus-dolly-guanaco/blob/main/results_2023-09-28T14-27-44.520216.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002307046979865772,\n \"em_stderr\": 0.0004913221265094559,\n \"f1\": 0.04980704697986585,\n \"f1_stderr\": 0.0013966099124026671,\n \"acc\": 0.2517758484609313,\n \"acc_stderr\": 0.007026065573457924\n },\n \"harness|drop|3\": {\n \"em\": 0.002307046979865772,\n \"em_stderr\": 0.0004913221265094559,\n \"f1\": 0.04980704697986585,\n \"f1_stderr\": 0.0013966099124026671\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5035516969218626,\n \"acc_stderr\": 0.014052131146915848\n }\n}\n```", "repo_url": "https://huggingface.co/lgaalves/gpt2_platypus-dolly-guanaco", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|arc:challenge|25_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_28T14_27_44.520216", "path": ["**/details_harness|drop|3_2023-09-28T14-27-44.520216.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-28T14-27-44.520216.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_28T14_27_44.520216", "path": ["**/details_harness|gsm8k|5_2023-09-28T14-27-44.520216.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-28T14-27-44.520216.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hellaswag|10_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:05:00.341927.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:05:00.341927.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T20:05:00.341927.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T20:05:00.341927.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T20:05:00.341927.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_28T14_27_44.520216", "path": ["**/details_harness|winogrande|5_2023-09-28T14-27-44.520216.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-28T14-27-44.520216.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_31T20_05_00.341927", "path": ["results_2023-08-31T20:05:00.341927.parquet"]}, {"split": "2023_09_28T14_27_44.520216", "path": ["results_2023-09-28T14-27-44.520216.parquet"]}, {"split": "latest", "path": ["results_2023-09-28T14-27-44.520216.parquet"]}]}]}
|
2023-09-28T13:27:55+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of lgaalves/gpt2_platypus-dolly-guanaco
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model lgaalves/gpt2_platypus-dolly-guanaco on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
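A minimal sketch, assuming the repository follows the leaderboard's usual naming pattern (`open-llm-leaderboard/details_<org>__<model>`; the explicit link was stripped from this card, so the id is an assumption) and using the `harness_winogrande_5` config listed in this card's metadata:

```python
from datasets import load_dataset

# Repository id inferred from the leaderboard naming pattern -- an assumption,
# since the original link was stripped from this card.
data = load_dataset(
    "open-llm-leaderboard/details_lgaalves__gpt2_platypus-dolly-guanaco",
    "harness_winogrande_5",
    split="train",  # "train" always points to the latest results
)
```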
## Latest results
These are the latest results from run 2023-09-28T14:27:44.520216 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of lgaalves/gpt2_platypus-dolly-guanaco",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model lgaalves/gpt2_platypus-dolly-guanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-28T14:27:44.520216(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of lgaalves/gpt2_platypus-dolly-guanaco",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model lgaalves/gpt2_platypus-dolly-guanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-28T14:27:44.520216(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
28,
31,
176,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of lgaalves/gpt2_platypus-dolly-guanaco## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model lgaalves/gpt2_platypus-dolly-guanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-28T14:27:44.520216(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
7c37facc30ce75634902462a7f8b96a1d6c9a6a5
|
# Dataset Card for "dialogsum_ru_3k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
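Although the card is a stub, the `dataset_info` stored with this repo lists four string fields (`id`, `dialogue`, `summary`, `topic`) and a single 3,000-example `train` split, so a minimal loading sketch looks like:

```python
from datasets import load_dataset

# Single "train" split with 3,000 examples, per the dataset_info metadata.
ds = load_dataset("dim/dialogsum_ru_3k", split="train")

# Every record carries four string fields.
example = ds[0]
print(example["topic"])
print(example["dialogue"][:200])  # dialogue text, truncated for display
print(example["summary"])
```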
|
dim/dialogsum_ru_3k
|
[
"region:us"
] |
2023-08-31T19:06:33+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "dialogue", "dtype": "string"}, {"name": "summary", "dtype": "string"}, {"name": "topic", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4602365.489566613, "num_examples": 3000}], "download_size": 2244730, "dataset_size": 4602365.489566613}}
|
2023-08-31T19:06:48+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "dialogsum_ru_3k"
More Information needed
|
[
"# Dataset Card for \"dialogsum_ru_3k\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"dialogsum_ru_3k\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"dialogsum_ru_3k\"\n\nMore Information needed"
] |
e0e173d63b716a5093073460f81d2a458f51a143
|
# Dataset Card for "voxelgym_5c_42x42_250000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
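The `dataset_info` metadata records five image columns (`image`, `label`, `rgb_label`, `path_label`, `path_rgb_label`) across a 200,000-example `train` split and a 50,000-example `validation` split. A minimal sketch for inspecting one example (the 42x42 grid size is inferred from the dataset name, so treat it as an assumption):

```python
from datasets import load_dataset

# "validation" split holds 50,000 examples, per the dataset_info metadata.
ds = load_dataset("Cubpaw/voxelgym_5c_42x42_250000", split="validation")

sample = ds[0]
# Each image column decodes to a PIL image by default.
for col in ("image", "label", "rgb_label", "path_label", "path_rgb_label"):
    print(col, sample[col].size)  # expected (42, 42), per the dataset name
```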
|
Cubpaw/voxelgym_5c_42x42_250000
|
[
"region:us"
] |
2023-08-31T19:06:51+00:00
|
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": "image"}, {"name": "rgb_label", "dtype": "image"}, {"name": "path_label", "dtype": "image"}, {"name": "path_rgb_label", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 186844000.0, "num_examples": 200000}, {"name": "validation", "num_bytes": 46250450.0, "num_examples": 50000}], "download_size": 177114030, "dataset_size": 233094450.0}}
|
2023-08-31T19:12:00+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "voxelgym_5c_42x42_250000"
More Information needed
|
[
"# Dataset Card for \"voxelgym_5c_42x42_250000\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"voxelgym_5c_42x42_250000\"\n\nMore Information needed"
] |
[
6,
23
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"voxelgym_5c_42x42_250000\"\n\nMore Information needed"
] |
1ee971dbc14fcd2e18427cc8dac855ba7e5f36fd
|
# Dataset Card for "dialogsum_3k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
dim/dialogsum_3k
|
[
"region:us"
] |
2023-08-31T19:08:27+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "dialogue", "dtype": "string"}, {"name": "summary", "dtype": "string"}, {"name": "topic", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2754324.55858748, "num_examples": 3000}], "download_size": 1570734, "dataset_size": 2754324.55858748}}
|
2023-08-31T19:08:56+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "dialogsum_3k"
More Information needed
|
[
"# Dataset Card for \"dialogsum_3k\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"dialogsum_3k\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"dialogsum_3k\"\n\nMore Information needed"
] |
47a9a9d6fcc04d72de39f0cea97927d4653f8dea
|
# insurance sserf 1
This dataset includes publicly available sserf insurance filing data.
|
jwixel/insurance-sserf-1
|
[
"region:us"
] |
2023-08-31T19:10:54+00:00
|
{}
|
2023-09-01T19:11:19+00:00
|
[] |
[] |
TAGS
#region-us
|
# insurance sserf 1
This dataset includes publicly available sserf insurance filing data.
|
[
"# insurance sserf 1\n\nThis dataset includes publicly available sserf insurance filing data."
] |
[
"TAGS\n#region-us \n",
"# insurance sserf 1\n\nThis dataset includes publicly available sserf insurance filing data."
] |
[
6,
21
] |
[
"passage: TAGS\n#region-us \n# insurance sserf 1\n\nThis dataset includes publicly available sserf insurance filing data."
] |
c28fd718782ef146aaf5eab37d502b3339ec404a
|
# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dhmeltzer/llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [dhmeltzer/llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged](https://huggingface.co/dhmeltzer/llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged",
"harness_winogrande_5",
split="train")
```
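Since the repository holds 64 evaluation configurations, it can help to enumerate them before choosing one. A small sketch using the `datasets` library's config helper:

```python
from datasets import get_dataset_config_names

configs = get_dataset_config_names(
    "open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged"
)
print(len(configs))  # the 64 eval configs, plus the aggregated "results" config
print(configs[:5])
```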
## Latest results
These are the [latest results from run 2023-10-26T03:13:00.871936](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged/blob/main/results_2023-10-26T03-13-00.871936.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001572986577181208,
"em_stderr": 0.0004058451132417744,
"f1": 0.06281459731543623,
"f1_stderr": 0.0014401527427077175,
"acc": 0.3901322603943458,
"acc_stderr": 0.009101657407871456
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.0004058451132417744,
"f1": 0.06281459731543623,
"f1_stderr": 0.0014401527427077175
},
"harness|gsm8k|5": {
"acc": 0.04624715693707354,
"acc_stderr": 0.005784991662691866
},
"harness|winogrande|5": {
"acc": 0.734017363851618,
"acc_stderr": 0.012418323153051046
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged
|
[
"region:us"
] |
2023-08-31T19:12:22+00:00
|
{"pretty_name": "Evaluation run of dhmeltzer/llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged", "dataset_summary": "Dataset automatically created during the evaluation run of model [dhmeltzer/llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged](https://huggingface.co/dhmeltzer/llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-26T03:13:00.871936](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged/blob/main/results_2023-10-26T03-13-00.871936.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.0004058451132417744,\n \"f1\": 0.06281459731543623,\n \"f1_stderr\": 0.0014401527427077175,\n \"acc\": 0.3901322603943458,\n \"acc_stderr\": 0.009101657407871456\n },\n \"harness|drop|3\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.0004058451132417744,\n \"f1\": 0.06281459731543623,\n \"f1_stderr\": 0.0014401527427077175\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04624715693707354,\n \"acc_stderr\": 0.005784991662691866\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.734017363851618,\n \"acc_stderr\": 0.012418323153051046\n }\n}\n```", "repo_url": "https://huggingface.co/dhmeltzer/llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|arc:challenge|25_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|arc:challenge|25_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_26T03_13_00.871936", "path": ["**/details_harness|drop|3_2023-10-26T03-13-00.871936.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-26T03-13-00.871936.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_26T03_13_00.871936", "path": ["**/details_harness|gsm8k|5_2023-10-26T03-13-00.871936.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-26T03-13-00.871936.parquet"]}]}, 
{"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hellaswag|10_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hellaswag|10_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:11:58.218501.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T20:11:58.218501.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T11:01:05.889294.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T11:01:05.889294.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T11:01:05.889294.parquet", 
"**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T11:01:05.889294.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:11:58.218501.parquet"]}, 
{"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:11:58.218501.parquet"]}, 
{"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-management|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:11:58.218501.parquet"]}, 
{"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T11:01:05.889294.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T11:01:05.889294.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_26T03_13_00.871936", "path": ["**/details_harness|winogrande|5_2023-10-26T03-13-00.871936.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-26T03-13-00.871936.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_31T20_11_58.218501", "path": ["results_2023-08-31T20:11:58.218501.parquet"]}, {"split": "2023_09_05T11_01_05.889294", "path": ["results_2023-09-05T11:01:05.889294.parquet"]}, {"split": "2023_10_26T03_13_00.871936", "path": ["results_2023-10-26T03-13-00.871936.parquet"]}, {"split": "latest", "path": ["results_2023-10-26T03-13-00.871936.parquet"]}]}]}
|
2023-10-26T02:13:14+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model dhmeltzer/llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
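A minimal sketch (assuming this details repository follows the standard open-llm-leaderboard naming scheme, and that a `harness_winogrande_5` configuration exists here as it does in sibling repositories):

```python
from datasets import load_dataset

# Assumed repository id, derived from the usual details-repo naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged",
    "harness_winogrande_5",  # assumed configuration name
    split="train",
)
```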
## Latest results
These are the latest results from run 2023-10-26T03:13:00.871936 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
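The numeric values themselves can be retrieved from the "results" configuration; a minimal sketch (assuming the standard details-repo layout, where a "latest" split points at the newest run):

```python
from datasets import load_dataset

# Assumed repository id and configuration; "latest" selects the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics per run
```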
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model dhmeltzer/llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-26T03:13:00.871936(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model dhmeltzer/llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-26T03:13:00.871936(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
41,
31,
189,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model dhmeltzer/llama-7b-SFT_ds_wiki65k_1024_r_64_alpha_16_merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-26T03:13:00.871936(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
3d3c383dcb3fd92689d790b939ef9ea7ef6d37ac
|
# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged](https://huggingface.co/dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-23T02:32:29.889324](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged/blob/main/results_2023-10-23T02-32-29.889324.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.012164429530201342,
"em_stderr": 0.0011226072817371853,
"f1": 0.07720742449664415,
"f1_stderr": 0.0018320825904246663,
"acc": 0.3909059684425251,
"acc_stderr": 0.009118223911065027
},
"harness|drop|3": {
"em": 0.012164429530201342,
"em_stderr": 0.0011226072817371853,
"f1": 0.07720742449664415,
"f1_stderr": 0.0018320825904246663
},
"harness|gsm8k|5": {
"acc": 0.04700530705079606,
"acc_stderr": 0.005829898355937193
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.01240654946619286
}
}
```
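To consume these aggregated metrics programmatically rather than reading the JSON by hand, one option is to load the "results" configuration of this repository; a minimal sketch (the exact column names of the flattened rows are an assumption and may differ):

```python
import pandas as pd
from datasets import load_dataset

# "results" aggregates one row of metrics per evaluation run; "latest" selects the newest.
results = load_dataset(
    "open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged",
    "results",
    split="latest",
)
df = results.to_pandas()
print(df.filter(like="acc").T)  # inspect accuracy-style columns, whatever their exact names
```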
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged
|
[
"region:us"
] |
2023-08-31T19:14:59+00:00
|
{"pretty_name": "Evaluation run of dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged", "dataset_summary": "Dataset automatically created during the evaluation run of model [dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged](https://huggingface.co/dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T02:32:29.889324](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged/blob/main/results_2023-10-23T02-32-29.889324.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.012164429530201342,\n \"em_stderr\": 0.0011226072817371853,\n \"f1\": 0.07720742449664415,\n \"f1_stderr\": 0.0018320825904246663,\n \"acc\": 0.3909059684425251,\n \"acc_stderr\": 0.009118223911065027\n },\n \"harness|drop|3\": {\n \"em\": 0.012164429530201342,\n \"em_stderr\": 0.0011226072817371853,\n \"f1\": 0.07720742449664415,\n \"f1_stderr\": 0.0018320825904246663\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04700530705079606,\n \"acc_stderr\": 0.005829898355937193\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.01240654946619286\n }\n}\n```", "repo_url": "https://huggingface.co/dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|arc:challenge|25_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|arc:challenge|25_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T02_32_29.889324", "path": ["**/details_harness|drop|3_2023-10-23T02-32-29.889324.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T02-32-29.889324.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T02_32_29.889324", "path": ["**/details_harness|gsm8k|5_2023-10-23T02-32-29.889324.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T02-32-29.889324.parquet"]}]}, 
{"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hellaswag|10_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hellaswag|10_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:14:35.728415.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T20:14:35.728415.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T11:32:06.887851.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T11:32:06.887851.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T11:32:06.887851.parquet", 
"**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T11:32:06.887851.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:14:35.728415.parquet"]}, 
{"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:14:35.728415.parquet"]}, 
{"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-management|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:14:35.728415.parquet"]}, 
{"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T11:32:06.887851.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T11:32:06.887851.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T02_32_29.889324", "path": ["**/details_harness|winogrande|5_2023-10-23T02-32-29.889324.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T02-32-29.889324.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_31T20_14_35.728415", "path": ["results_2023-08-31T20:14:35.728415.parquet"]}, {"split": "2023_09_05T11_32_06.887851", "path": ["results_2023-09-05T11:32:06.887851.parquet"]}, {"split": "2023_10_23T02_32_29.889324", "path": ["results_2023-10-23T02-32-29.889324.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T02-32-29.889324.parquet"]}]}]}
|
2023-10-23T01:32:42+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
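The original snippet was not preserved in this stripped rendering, so the following is a minimal sketch assuming the standard `datasets` API. The repo id `open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged` is an assumption inferred from the naming convention used elsewhere in this document; the `harness_winogrande_5` configuration name is taken from this record's metadata:

```python
from datasets import load_dataset

# Assumed repo id, following the naming convention of the other
# evaluation-run datasets in this dump (not confirmed by this record).
data = load_dataset(
    "open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged",
    "harness_winogrande_5",  # configuration name listed in this record's metadata
    split="train",           # "train" always points to the latest results
)
```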
## Latest results
These are the latest results from run 2023-10-23T02:32:29.889324 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T02:32:29.889324(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T02:32:29.889324(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
42,
31,
190,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model dhmeltzer/llama-7b-SFT_eli5_wiki65k_1024_r_64_alpha_16_merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T02:32:29.889324(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
95361a412ea4dd114dc12f4f5a63c43228d967f7
|
# Dataset Card for "dolphin_flan1m_alpaca_uncensored_3k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
dim/dolphin_flan1m_alpaca_uncensored_3k
|
[
"region:us"
] |
2023-08-31T19:18:01+00:00
|
{"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5235792.840107775, "num_examples": 3000}], "download_size": 2954863, "dataset_size": 5235792.840107775}}
|
2023-08-31T19:18:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "dolphin_flan1m_alpaca_uncensored_3k"
More Information needed
|
[
"# Dataset Card for \"dolphin_flan1m_alpaca_uncensored_3k\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"dolphin_flan1m_alpaca_uncensored_3k\"\n\nMore Information needed"
] |
[
6,
27
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"dolphin_flan1m_alpaca_uncensored_3k\"\n\nMore Information needed"
] |
a91cc06ac7f1a28c08fca475fd8699e855b76fe8
|
# Dataset Card for "dolphin_ru_3k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
dim/dolphin_ru_3k
|
[
"region:us"
] |
2023-08-31T19:20:15+00:00
|
{"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 8490195.387822216, "num_examples": 3000}], "download_size": 4148079, "dataset_size": 8490195.387822216}}
|
2023-08-31T19:24:23+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "dolphin_ru_3k"
More Information needed
|
[
"# Dataset Card for \"dolphin_ru_3k\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"dolphin_ru_3k\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"dolphin_ru_3k\"\n\nMore Information needed"
] |
0254acf7b29e14a990a3bdfd9ade237945e3d181
|
# Dataset Card for "LaMini-LM-filtered-instruction-only-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
sachith-surge/LaMini-instruction-only-SequenceMatcher-Levenstein
|
[
"region:us"
] |
2023-08-31T19:28:55+00:00
|
{"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "similarity_sequencematcher", "dtype": "float64"}, {"name": "most_similar_example_sequencematcher", "dtype": "string"}, {"name": "similarity_edit", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 79185.40272028386, "num_examples": 186}], "download_size": 56949, "dataset_size": 79185.40272028386}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-08-31T19:28:58+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "LaMini-LM-filtered-instruction-only-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML"
More Information needed
|
[
"# Dataset Card for \"LaMini-LM-filtered-instruction-only-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"LaMini-LM-filtered-instruction-only-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML\"\n\nMore Information needed"
] |
[
6,
45
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"LaMini-LM-filtered-instruction-only-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML\"\n\nMore Information needed"
] |
dc914130868998b80dec3417e61d2fb1e0d9a833
|
# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged](https://huggingface.co/dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged",
"harness_winogrande_5",
split="train")
```
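As a usage note, the aggregated scores live in the separate "results" configuration mentioned above. This is a hedged sketch: the `latest` split name is an assumption, inferred from the split layout shown for the sibling record's "results" configuration in this dump rather than from this record's (truncated) metadata:

```python
# Sketch: load the aggregated results rather than per-task details.
results = load_dataset(
    "open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged",
    "results",
    split="latest",  # assumed split name, following the pattern of other records
)
print(results[0])  # one row of aggregated metrics per run
```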
## Latest results
These are the [latest results from run 2023-10-23T02:38:18.626473](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged/blob/main/results_2023-10-23T02-38-18.626473.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.014681208053691275,
"em_stderr": 0.0012317113143108561,
"f1": 0.07373846476510039,
"f1_stderr": 0.0018229608118759215,
"acc": 0.3983262056052844,
"acc_stderr": 0.009142329658293176
},
"harness|drop|3": {
"em": 0.014681208053691275,
"em_stderr": 0.0012317113143108561,
"f1": 0.07373846476510039,
"f1_stderr": 0.0018229608118759215
},
"harness|gsm8k|5": {
"acc": 0.05079605761940864,
"acc_stderr": 0.006048352096878091
},
"harness|winogrande|5": {
"acc": 0.7458563535911602,
"acc_stderr": 0.012236307219708262
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged
|
[
"region:us"
] |
2023-08-31T19:30:42+00:00
|
{"pretty_name": "Evaluation run of dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged", "dataset_summary": "Dataset automatically created during the evaluation run of model [dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged](https://huggingface.co/dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T02:38:18.626473](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged/blob/main/results_2023-10-23T02-38-18.626473.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.014681208053691275,\n \"em_stderr\": 0.0012317113143108561,\n \"f1\": 0.07373846476510039,\n \"f1_stderr\": 0.0018229608118759215,\n \"acc\": 0.3983262056052844,\n \"acc_stderr\": 0.009142329658293176\n },\n \"harness|drop|3\": {\n \"em\": 0.014681208053691275,\n \"em_stderr\": 0.0012317113143108561,\n \"f1\": 0.07373846476510039,\n \"f1_stderr\": 0.0018229608118759215\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05079605761940864,\n \"acc_stderr\": 0.006048352096878091\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7458563535911602,\n \"acc_stderr\": 0.012236307219708262\n }\n}\n```", "repo_url": "https://huggingface.co/dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|arc:challenge|25_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|arc:challenge|25_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T02_38_18.626473", "path": ["**/details_harness|drop|3_2023-10-23T02-38-18.626473.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T02-38-18.626473.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T02_38_18.626473", "path": ["**/details_harness|gsm8k|5_2023-10-23T02-38-18.626473.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T02-38-18.626473.parquet"]}]}, {"config_name": 
"harness_hellaswag_10", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hellaswag|10_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hellaswag|10_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:30:17.516134.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T20:30:17.516134.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:49:05.320050.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:49:05.320050.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:49:05.320050.parquet", 
"**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T20:49:05.320050.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:30:17.516134.parquet"]}, 
{"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:30:17.516134.parquet"]}, 
{"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-management|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:30:17.516134.parquet"]}, 
{"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T20:49:05.320050.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T20:49:05.320050.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T02_38_18.626473", "path": ["**/details_harness|winogrande|5_2023-10-23T02-38-18.626473.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T02-38-18.626473.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_31T20_30_17.516134", "path": ["results_2023-08-31T20:30:17.516134.parquet"]}, {"split": "2023_08_31T20_49_05.320050", "path": ["results_2023-08-31T20:49:05.320050.parquet"]}, {"split": "2023_10_23T02_38_18.626473", "path": ["results_2023-10-23T02-38-18.626473.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T02-38-18.626473.parquet"]}]}]}
|
2023-10-23T01:38:32+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
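A minimal sketch, assuming the repository follows the leaderboard's `details_<org>__<model>` naming pattern; the `harness_winogrande_5` config name is illustrative:

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's "details_<org>__<model>" naming pattern;
# "harness_winogrande_5" is one of the per-task configurations described above.
data = load_dataset(
    "open-llm-leaderboard/details_dhmeltzer__llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged",
    "harness_winogrande_5",
    split="train",
)
```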
## Latest results
These are the latest results from run 2023-10-23T02:38:18.626473 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T02:38:18.626473(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T02:38:18.626473(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
40,
31,
188,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model dhmeltzer/llama-7b-SFT_ds_eli5_1024_r_64_alpha_16_merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T02:38:18.626473(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
279db727e80de36743a527d23adf67484a7942ba
|
# Dataset Card for "LaMini-LM-raw-instruction-only-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
sachith-surge/LaMini-LM-raw-instruction-only-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML
|
[
"region:us"
] |
2023-08-31T19:32:35+00:00
|
{"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 401407.7971614429, "num_examples": 1505}], "download_size": 212603, "dataset_size": 401407.7971614429}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-08-31T19:32:37+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "LaMini-LM-raw-instruction-only-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML"
More Information needed
|
[
"# Dataset Card for \"LaMini-LM-raw-instruction-only-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"LaMini-LM-raw-instruction-only-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML\"\n\nMore Information needed"
] |
[
6,
44
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"LaMini-LM-raw-instruction-only-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML\"\n\nMore Information needed"
] |
512f7b81e8b5251eb57334ec022ca889580fc836
|
---
license: cc-by-4.0
task_categories:
- text-classification
language:
- tr
tags:
- food
- Generated Review
size_categories:
- 1K<n<10K
|
Gokce/Generated_Restaurant_Reviews_GPT3.5
|
[
"region:us"
] |
2023-08-31T19:37:20+00:00
|
{"pretty_name": "AI_Restaurant_Reviews"}
|
2023-08-31T20:12:36+00:00
|
[] |
[] |
TAGS
#region-us
|
---
license: cc-by-4.0
task_categories:
- text-classification
language:
- tr
tags:
- food
- Generated Review
size_categories:
- 1K<n<10K
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
f655fab57239644e6db80782ea163d866d57a316
|
# Dataset Card for Evaluation run of TheBloke/fiction.live-Kimiko-V2-70B-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/fiction.live-Kimiko-V2-70B-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/fiction.live-Kimiko-V2-70B-fp16](https://huggingface.co/TheBloke/fiction.live-Kimiko-V2-70B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__fiction.live-Kimiko-V2-70B-fp16",
"harness_winogrande_5",
split="train")
```
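The aggregated metrics can be loaded the same way via the "results" configuration; a sketch, with the "latest" split name taken from the split layout described above:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run;
# the "latest" split points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__fiction.live-Kimiko-V2-70B-fp16",
    "results",
    split="latest",
)
```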
## Latest results
These are the [latest results from run 2023-10-23T10:02:44.747886](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__fiction.live-Kimiko-V2-70B-fp16/blob/main/results_2023-10-23T10-02-44.747886.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177344,
"f1": 0.06689072986577178,
"f1_stderr": 0.0013705945295387344,
"acc": 0.5923530956998468,
"acc_stderr": 0.011715067911613648
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177344,
"f1": 0.06689072986577178,
"f1_stderr": 0.0013705945295387344
},
"harness|gsm8k|5": {
"acc": 0.3457164518574678,
"acc_stderr": 0.01310042299044158
},
"harness|winogrande|5": {
"acc": 0.8389897395422258,
"acc_stderr": 0.010329712832785717
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__fiction.live-Kimiko-V2-70B-fp16
|
[
"region:us"
] |
2023-08-31T19:41:49+00:00
|
{"pretty_name": "Evaluation run of TheBloke/fiction.live-Kimiko-V2-70B-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/fiction.live-Kimiko-V2-70B-fp16](https://huggingface.co/TheBloke/fiction.live-Kimiko-V2-70B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__fiction.live-Kimiko-V2-70B-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T10:02:44.747886](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__fiction.live-Kimiko-V2-70B-fp16/blob/main/results_2023-10-23T10-02-44.747886.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.00040584511324177344,\n \"f1\": 0.06689072986577178,\n \"f1_stderr\": 0.0013705945295387344,\n \"acc\": 0.5923530956998468,\n \"acc_stderr\": 0.011715067911613648\n },\n \"harness|drop|3\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.00040584511324177344,\n \"f1\": 0.06689072986577178,\n \"f1_stderr\": 0.0013705945295387344\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3457164518574678,\n \"acc_stderr\": 0.01310042299044158\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8389897395422258,\n \"acc_stderr\": 0.010329712832785717\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/fiction.live-Kimiko-V2-70B-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|arc:challenge|25_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T10_02_44.747886", "path": ["**/details_harness|drop|3_2023-10-23T10-02-44.747886.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T10-02-44.747886.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T10_02_44.747886", "path": ["**/details_harness|gsm8k|5_2023-10-23T10-02-44.747886.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T10-02-44.747886.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hellaswag|10_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:41:25.940897.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:41:25.940897.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:41:25.940897.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T20:41:25.940897.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T20:41:25.940897.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T20:41:25.940897.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T10_02_44.747886", "path": ["**/details_harness|winogrande|5_2023-10-23T10-02-44.747886.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T10-02-44.747886.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_31T20_41_25.940897", "path": ["results_2023-08-31T20:41:25.940897.parquet"]}, {"split": "2023_10_23T10_02_44.747886", "path": ["results_2023-10-23T10-02-44.747886.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T10-02-44.747886.parquet"]}]}]}
|
2023-10-23T09:02:58+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/URL-Kimiko-V2-70B-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/URL-Kimiko-V2-70B-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
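A minimal sketch in the style of the sibling cards in this dump, assuming the repo id follows the usual `open-llm-leaderboard/details_<org>__<model>` pattern (the model name above is partially masked, so the exact repo id is an assumption; `harness_winogrande_5` is one of the per-task configurations listed in the metadata):

```python
from datasets import load_dataset

# Repo id assumed from the org/model name above (partially masked in this card);
# "harness_winogrande_5" is one of the per-task configurations.
data = load_dataset("open-llm-leaderboard/details_TheBloke__URL-Kimiko-V2-70B-fp16",
	"harness_winogrande_5",
	split="train")
```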
## Latest results
These are the latest results from run 2023-10-23T10:02:44.747886 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/URL-Kimiko-V2-70B-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/URL-Kimiko-V2-70B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T10:02:44.747886(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/URL-Kimiko-V2-70B-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/URL-Kimiko-V2-70B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T10:02:44.747886(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
28,
31,
176,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/URL-Kimiko-V2-70B-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/URL-Kimiko-V2-70B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T10:02:44.747886(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
23ffeb5454bf0df63a4d236a29393a64770d6eae
|
# Dataset Card for "llama_2-optimized-titles-esci-sft-all"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
qazi-ali/llama_2-optimized-titles-esci-sft-all
|
[
"region:us"
] |
2023-08-31T19:46:41+00:00
|
{"dataset_info": {"features": [{"name": "product_title", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "preds", "dtype": "string"}, {"name": "clean_preds", "dtype": "string"}, {"name": "average_score", "dtype": "float64"}, {"name": "new_score", "dtype": "float64"}, {"name": "good_pred", "dtype": "string"}, {"name": "bad_pred", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}, {"name": "index", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 11005402.0, "num_examples": 7995}], "download_size": 6024391, "dataset_size": 11005402.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-08-31T19:46:43+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "llama_2-optimized-titles-esci-sft-all"
More Information needed
|
[
"# Dataset Card for \"llama_2-optimized-titles-esci-sft-all\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"llama_2-optimized-titles-esci-sft-all\"\n\nMore Information needed"
] |
[
6,
27
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"llama_2-optimized-titles-esci-sft-all\"\n\nMore Information needed"
] |
473d9aeb68581899682ce486595945cdeeb68414
|
# Dataset Card for "llama_2-optimized-titles-esci-sft-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
qazi-ali/llama_2-optimized-titles-esci-sft-test
|
[
"region:us"
] |
2023-08-31T19:46:43+00:00
|
{"dataset_info": {"features": [{"name": "index", "dtype": "int64"}, {"name": "product_title", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "preds", "dtype": "string"}, {"name": "clean_preds", "dtype": "string"}, {"name": "average_score", "dtype": "float64"}, {"name": "new_score", "dtype": "float64"}, {"name": "good_pred", "dtype": "string"}, {"name": "bad_pred", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 3059731.0, "num_examples": 2321}], "download_size": 1697427, "dataset_size": 3059731.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-08-31T19:46:44+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "llama_2-optimized-titles-esci-sft-test"
More Information needed
|
[
"# Dataset Card for \"llama_2-optimized-titles-esci-sft-test\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"llama_2-optimized-titles-esci-sft-test\"\n\nMore Information needed"
] |
[
6,
27
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"llama_2-optimized-titles-esci-sft-test\"\n\nMore Information needed"
] |
ec1bfd29870c31390658b2a6428e8c5cd5d248f8
|
# Dataset Card for "HC3_ru_8k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
dim/HC3_ru_8k
|
[
"region:us"
] |
2023-08-31T19:47:27+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "human_answers", "sequence": "string"}, {"name": "chatgpt_answers", "sequence": "string"}, {"name": "source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 44537809.06175479, "num_examples": 8000}], "download_size": 21121279, "dataset_size": 44537809.06175479}}
|
2023-08-31T19:48:03+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "HC3_ru_8k"
More Information needed
|
[
"# Dataset Card for \"HC3_ru_8k\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"HC3_ru_8k\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"HC3_ru_8k\"\n\nMore Information needed"
] |
78e2c0e7f65488446a34757ee2f651d90f904ed8
|
# Dataset Card for "ru_word_games_3k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
dim/ru_word_games_3k
|
[
"region:us"
] |
2023-08-31T19:50:39+00:00
|
{"dataset_info": {"features": [{"name": "subset", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "prompt", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 295312.9940025371, "num_examples": 3000}], "download_size": 150082, "dataset_size": 295312.9940025371}}
|
2023-08-31T19:50:57+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "ru_word_games_3k"
More Information needed
|
[
"# Dataset Card for \"ru_word_games_3k\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"ru_word_games_3k\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"ru_word_games_3k\"\n\nMore Information needed"
] |
120a30c2efcd7097426e66fd5dc338207789a07e
|
# Dataset Card for "gua-llama-ofan"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
FarAwayFer/gua-llama-ofan
|
[
"region:us"
] |
2023-08-31T19:57:24+00:00
|
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1656684, "num_examples": 1008}], "download_size": 970097, "dataset_size": 1656684}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-08-31T20:02:42+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "gua-llama-ofan"
More Information needed
|
[
"# Dataset Card for \"gua-llama-ofan\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"gua-llama-ofan\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"gua-llama-ofan\"\n\nMore Information needed"
] |
019a871cd98430fe88baa7bec1435320a7aa78a2
|
# Dataset Card for "autotree_pmlb_twonorm_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
yzhuang/autotree_pmlb_twonorm_sgosdt_l256_d3_sd0
|
[
"region:us"
] |
2023-08-31T20:04:20+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "input_x", "sequence": {"sequence": "float32"}}, {"name": "input_y", "sequence": {"sequence": "float32"}}, {"name": "rtg", "sequence": "float64"}, {"name": "status", "sequence": {"sequence": "float32"}}, {"name": "split_threshold", "sequence": {"sequence": "float32"}}, {"name": "split_dimension", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 308080000, "num_examples": 10000}, {"name": "validation", "num_bytes": 308080000, "num_examples": 10000}], "download_size": 242554163, "dataset_size": 616160000}}
|
2023-08-31T20:04:30+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "autotree_pmlb_twonorm_sgosdt_l256_d3_sd0"
More Information needed
|
[
"# Dataset Card for \"autotree_pmlb_twonorm_sgosdt_l256_d3_sd0\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"autotree_pmlb_twonorm_sgosdt_l256_d3_sd0\"\n\nMore Information needed"
] |
[
6,
32
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"autotree_pmlb_twonorm_sgosdt_l256_d3_sd0\"\n\nMore Information needed"
] |
5e3ca169fb69b7b6f6bfe97bd48f716f9c742bc1
|
# Dataset Card for "malware-top-100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
PurCL/malware-top-100
|
[
"region:us"
] |
2023-08-31T20:09:21+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "binary_name", "dtype": "string"}, {"name": "labels", "sequence": "string"}, {"name": "functions", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5667834326.115244, "num_examples": 3728}, {"name": "test", "num_bytes": 1667814982.765135, "num_examples": 1097}, {"name": "valid", "num_bytes": 1001905263.1196207, "num_examples": 659}], "download_size": 2454551882, "dataset_size": 8337554571.999999}}
|
2023-08-31T20:13:38+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "malware-top-100"
More Information needed
|
[
"# Dataset Card for \"malware-top-100\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"malware-top-100\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"malware-top-100\"\n\nMore Information needed"
] |
3dd8fd9ce26754d30c6471223d67b2d77155d9f6
|
# Dataset Card for "malware-top-100-labels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
PurCL/malware-top-100-labels
|
[
"region:us"
] |
2023-08-31T20:13:38+00:00
|
{"dataset_info": {"features": [{"name": "l", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1045, "num_examples": 100}], "download_size": 1723, "dataset_size": 1045}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-08-31T20:13:39+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "malware-top-100-labels"
More Information needed
|
[
"# Dataset Card for \"malware-top-100-labels\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"malware-top-100-labels\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"malware-top-100-labels\"\n\nMore Information needed"
] |
3d1463dfce9988b7ec35e67618cdf2c137b74782
|
# Dataset Card for "dila_legifrance"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
manu/dila_legifrance
|
[
"region:us"
] |
2023-08-31T20:15:31+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4399589467, "num_examples": 2349748}], "download_size": 1326748165, "dataset_size": 4399589467}}
|
2023-08-31T20:22:47+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "dila_legifrance"
More Information needed
|
[
"# Dataset Card for \"dila_legifrance\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"dila_legifrance\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"dila_legifrance\"\n\nMore Information needed"
] |
d1c6f5d212cfa8af1d464ecc1e1b005d634a81e8
|
# Dataset Card for "cc100_dedup"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
eduagarcia-temp/cc100_dedup
|
[
"region:us"
] |
2023-08-31T20:25:50+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 52750544585, "num_examples": 38059979}], "download_size": 33900809688, "dataset_size": 52750544585}}
|
2023-08-31T23:43:51+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "cc100_dedup"
More Information needed
|
[
"# Dataset Card for \"cc100_dedup\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"cc100_dedup\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"cc100_dedup\"\n\nMore Information needed"
] |
54d9236b4bb7e0d30413cc3b441478edcde06aa1
|
# Dataset Card for "runne_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
dim/runne_prompts
|
[
"region:us"
] |
2023-08-31T20:35:34+00:00
|
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "parsed_entities", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2636744, "num_examples": 537}], "download_size": 1142735, "dataset_size": 2636744}}
|
2023-09-02T15:20:49+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "runne_prompts"
More Information needed
|
[
"# Dataset Card for \"runne_prompts\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"runne_prompts\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"runne_prompts\"\n\nMore Information needed"
] |
8617d223c5c87eca74421b0878ccb163d503a084
|
# Dataset Card for Evaluation run of Sao10K/Stheno-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Stheno-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Stheno-L2-13B](https://huggingface.co/Sao10K/Stheno-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-L2-13B",
"harness_winogrande_5",
split="train")
```
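The aggregated metrics can be loaded the same way from the "results" configuration; a minimal sketch, assuming the "latest" split name used by the per-task configurations in this card's metadata:

```python
from datasets import load_dataset

# A sketch: the "results" configuration aggregates all task metrics for this
# model, and its "latest" split points at the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-L2-13B",
	"results",
	split="latest")
```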
## Latest results
These are the [latest results from run 2023-09-17T19:58:15.473819](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-L2-13B/blob/main/results_2023-09-17T19-58-15.473819.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2925755033557047,
"em_stderr": 0.004659064029280355,
"f1": 0.35764366610738435,
"f1_stderr": 0.004568345368095279,
"acc": 0.43558446671888545,
"acc_stderr": 0.010545764058478083
},
"harness|drop|3": {
"em": 0.2925755033557047,
"em_stderr": 0.004659064029280355,
"f1": 0.35764366610738435,
"f1_stderr": 0.004568345368095279
},
"harness|gsm8k|5": {
"acc": 0.1197877179681577,
"acc_stderr": 0.008944213403553058
},
"harness|winogrande|5": {
"acc": 0.7513812154696132,
"acc_stderr": 0.012147314713403105
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Sao10K__Stheno-L2-13B
|
[
"region:us"
] |
2023-08-31T21:32:35+00:00
|
{"pretty_name": "Evaluation run of Sao10K/Stheno-L2-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Sao10K/Stheno-L2-13B](https://huggingface.co/Sao10K/Stheno-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Stheno-L2-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T19:58:15.473819](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-L2-13B/blob/main/results_2023-09-17T19-58-15.473819.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2925755033557047,\n \"em_stderr\": 0.004659064029280355,\n \"f1\": 0.35764366610738435,\n \"f1_stderr\": 0.004568345368095279,\n \"acc\": 0.43558446671888545,\n \"acc_stderr\": 0.010545764058478083\n },\n \"harness|drop|3\": {\n \"em\": 0.2925755033557047,\n \"em_stderr\": 0.004659064029280355,\n \"f1\": 0.35764366610738435,\n \"f1_stderr\": 0.004568345368095279\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1197877179681577,\n \"acc_stderr\": 0.008944213403553058\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7513812154696132,\n \"acc_stderr\": 0.012147314713403105\n }\n}\n```", "repo_url": "https://huggingface.co/Sao10K/Stheno-L2-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|arc:challenge|25_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T19_58_15.473819", "path": ["**/details_harness|drop|3_2023-09-17T19-58-15.473819.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T19-58-15.473819.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T19_58_15.473819", "path": ["**/details_harness|gsm8k|5_2023-09-17T19-58-15.473819.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T19-58-15.473819.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hellaswag|10_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", 
"path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T22:32:10.395838.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T22:32:10.395838.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T22:32:10.395838.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T22:32:10.395838.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T22:32:10.395838.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T22:32:10.395838.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T19_58_15.473819", "path": ["**/details_harness|winogrande|5_2023-09-17T19-58-15.473819.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T19-58-15.473819.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_31T22_32_10.395838", "path": ["results_2023-08-31T22:32:10.395838.parquet"]}, {"split": "2023_09_17T19_58_15.473819", "path": ["results_2023-09-17T19-58-15.473819.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T19-58-15.473819.parquet"]}]}]}
|
2023-09-17T18:58:27+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Sao10K/Stheno-L2-13B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Sao10K/Stheno-L2-13B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
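The card's original code sample was stripped in this processed rendering; a minimal sketch of the intended call, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming scheme, is:

```python
from datasets import load_dataset

# Assumed repository name, derived from the leaderboard's naming convention
data = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-L2-13B",
                    "harness_winogrande_5",
                    split="train")
```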
## Latest results
These are the latest results from run 2023-09-17T19:58:15.473819 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Sao10K/Stheno-L2-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Stheno-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T19:58:15.473819(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Sao10K/Stheno-L2-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Stheno-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T19:58:15.473819(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Sao10K/Stheno-L2-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Stheno-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T19:58:15.473819(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
e955b51aef8107f99f6ae37e7ca77bc1e685bff6
|
# Dataset Card for Evaluation run of Sao10K/Stheno-Inverted-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Stheno-Inverted-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Stheno-Inverted-L2-13B](https://huggingface.co/Sao10K/Stheno-Inverted-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load the per-sample details of one evaluation task
# (here: 5-shot Winogrande) from this details repository
data = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-Inverted-L2-13B",
	"harness_winogrande_5",
	split="train")
```
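The per-task configurations cover individual evals; the repository metadata also lists a "results" configuration whose "latest" split holds the aggregated metrics. A sketch of loading it:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split always
# points to the newest evaluation run
results = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-Inverted-L2-13B",
                       "results",
                       split="latest")
```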
## Latest results
These are the [latest results from run 2023-10-24T14:49:52.594706](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-Inverted-L2-13B/blob/main/results_2023-10-24T14-49-52.594706.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.005453020134228188,
"em_stderr": 0.0007541727796792602,
"f1": 0.08334836409396004,
"f1_stderr": 0.00173175395556551,
"acc": 0.43967650267207525,
"acc_stderr": 0.01076620685162581
},
"harness|drop|3": {
"em": 0.005453020134228188,
"em_stderr": 0.0007541727796792602,
"f1": 0.08334836409396004,
"f1_stderr": 0.00173175395556551
},
"harness|gsm8k|5": {
"acc": 0.13191811978771797,
"acc_stderr": 0.009321265253857515
},
"harness|winogrande|5": {
"acc": 0.7474348855564326,
"acc_stderr": 0.012211148449394105
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Sao10K__Stheno-Inverted-L2-13B
|
[
"region:us"
] |
2023-08-31T21:34:49+00:00
|
{"pretty_name": "Evaluation run of Sao10K/Stheno-Inverted-L2-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Sao10K/Stheno-Inverted-L2-13B](https://huggingface.co/Sao10K/Stheno-Inverted-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Stheno-Inverted-L2-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-24T14:49:52.594706](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-Inverted-L2-13B/blob/main/results_2023-10-24T14-49-52.594706.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.005453020134228188,\n \"em_stderr\": 0.0007541727796792602,\n \"f1\": 0.08334836409396004,\n \"f1_stderr\": 0.00173175395556551,\n \"acc\": 0.43967650267207525,\n \"acc_stderr\": 0.01076620685162581\n },\n \"harness|drop|3\": {\n \"em\": 0.005453020134228188,\n \"em_stderr\": 0.0007541727796792602,\n \"f1\": 0.08334836409396004,\n \"f1_stderr\": 0.00173175395556551\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13191811978771797,\n \"acc_stderr\": 0.009321265253857515\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7474348855564326,\n \"acc_stderr\": 0.012211148449394105\n }\n}\n```", "repo_url": "https://huggingface.co/Sao10K/Stheno-Inverted-L2-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|arc:challenge|25_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_24T14_49_52.594706", "path": ["**/details_harness|drop|3_2023-10-24T14-49-52.594706.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-24T14-49-52.594706.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_24T14_49_52.594706", "path": ["**/details_harness|gsm8k|5_2023-10-24T14-49-52.594706.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-24T14-49-52.594706.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hellaswag|10_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T22:34:24.452875.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T22:34:24.452875.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T22:34:24.452875.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T22:34:24.452875.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T22:34:24.452875.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_24T14_49_52.594706", "path": ["**/details_harness|winogrande|5_2023-10-24T14-49-52.594706.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-24T14-49-52.594706.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_31T22_34_24.452875", "path": ["results_2023-08-31T22:34:24.452875.parquet"]}, {"split": "2023_10_24T14_49_52.594706", "path": ["results_2023-10-24T14-49-52.594706.parquet"]}, {"split": "latest", "path": ["results_2023-10-24T14-49-52.594706.parquet"]}]}]}
|
2023-10-24T13:50:06+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Sao10K/Stheno-Inverted-L2-13B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Sao10K/Stheno-Inverted-L2-13B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
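The code sample was stripped in this processed rendering; it mirrors the `load_dataset` call in the raw card above. As a complementary sketch, the available per-task configurations can be enumerated first with `get_dataset_config_names` from the `datasets` library:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Sao10K__Stheno-Inverted-L2-13B"

# Enumerate the 64 per-task configurations, then load one of them
configs = get_dataset_config_names(repo)
data = load_dataset(repo, "harness_winogrande_5", split="latest")
```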
## Latest results
These are the latest results from run 2023-10-24T14:49:52.594706 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Sao10K/Stheno-Inverted-L2-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Stheno-Inverted-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T14:49:52.594706(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Sao10K/Stheno-Inverted-L2-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Stheno-Inverted-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T14:49:52.594706(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Sao10K/Stheno-Inverted-L2-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Stheno-Inverted-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-24T14:49:52.594706(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
d594574f5cfaa02fdfb74e648ce80898828745b6
|
# Dataset Card for "horoscopes_ru_1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
dim/horoscopes_ru_1k
|
[
"region:us"
] |
2023-08-31T21:51:48+00:00
|
{"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prediction", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 952167, "num_examples": 1000}], "download_size": 462523, "dataset_size": 952167}}
|
2023-08-31T21:51:58+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "horoscopes_ru_1k"
More Information needed
|
[
"# Dataset Card for \"horoscopes_ru_1k\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"horoscopes_ru_1k\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"horoscopes_ru_1k\"\n\nMore Information needed"
] |
c7e89f806e077253f9116ddf817cbc25e081954a
|
# Dataset Card for Evaluation run of lgaalves/gpt2_guanaco-dolly-platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/gpt2_guanaco-dolly-platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/gpt2_guanaco-dolly-platypus](https://huggingface.co/lgaalves/gpt2_guanaco-dolly-platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__gpt2_guanaco-dolly-platypus",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-15T17:11:56.219131](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2_guanaco-dolly-platypus/blob/main/results_2023-10-15T17-11-56.219131.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965757,
"f1": 0.04961304530201346,
"f1_stderr": 0.001421455981669693,
"acc": 0.2505919494869771,
"acc_stderr": 0.007026223145264506
},
"harness|drop|3": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965757,
"f1": 0.04961304530201346,
"f1_stderr": 0.001421455981669693
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5011838989739542,
"acc_stderr": 0.014052446290529012
}
}
```
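Once parsed, the payload is an ordinary nested dict keyed by `harness|<task>|<n_shot>`; a toy illustration with the winogrande entry copied verbatim from the block above:

```python
# Values copied verbatim from the results JSON above
results = {
    "harness|winogrande|5": {
        "acc": 0.5011838989739542,
        "acc_stderr": 0.014052446290529012,
    }
}
print(f"winogrande acc: {results['harness|winogrande|5']['acc']:.4f}")
```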
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_lgaalves__gpt2_guanaco-dolly-platypus
|
[
"region:us"
] |
2023-08-31T22:17:20+00:00
|
{"pretty_name": "Evaluation run of lgaalves/gpt2_guanaco-dolly-platypus", "dataset_summary": "Dataset automatically created during the evaluation run of model [lgaalves/gpt2_guanaco-dolly-platypus](https://huggingface.co/lgaalves/gpt2_guanaco-dolly-platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__gpt2_guanaco-dolly-platypus\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T17:11:56.219131](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__gpt2_guanaco-dolly-platypus/blob/main/results_2023-10-15T17-11-56.219131.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642965757,\n \"f1\": 0.04961304530201346,\n \"f1_stderr\": 0.001421455981669693,\n \"acc\": 0.2505919494869771,\n \"acc_stderr\": 0.007026223145264506\n },\n \"harness|drop|3\": {\n \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642965757,\n \"f1\": 0.04961304530201346,\n \"f1_stderr\": 0.001421455981669693\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5011838989739542,\n \"acc_stderr\": 0.014052446290529012\n }\n}\n```", "repo_url": "https://huggingface.co/lgaalves/gpt2_guanaco-dolly-platypus", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|arc:challenge|25_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T17_11_56.219131", "path": ["**/details_harness|drop|3_2023-10-15T17-11-56.219131.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T17-11-56.219131.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T17_11_56.219131", "path": ["**/details_harness|gsm8k|5_2023-10-15T17-11-56.219131.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T17-11-56.219131.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hellaswag|10_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T23:17:05.227048.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T23:17:05.227048.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T23:17:05.227048.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T23:17:05.227048.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T23:17:05.227048.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T17_11_56.219131", "path": ["**/details_harness|winogrande|5_2023-10-15T17-11-56.219131.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T17-11-56.219131.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_31T23_17_05.227048", "path": ["results_2023-08-31T23:17:05.227048.parquet"]}, {"split": "2023_10_15T17_11_56.219131", "path": ["results_2023-10-15T17-11-56.219131.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T17-11-56.219131.parquet"]}]}]}
|
2023-10-15T16:12:07+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of lgaalves/gpt2_guanaco-dolly-platypus
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model lgaalves/gpt2_guanaco-dolly-platypus on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
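For instance (mirroring the snippet in the full card above):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_lgaalves__gpt2_guanaco-dolly-platypus",
	"harness_winogrande_5",
	split="train")
```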
## Latest results
These are the latest results from run 2023-10-15T17:11:56.219131 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of lgaalves/gpt2_guanaco-dolly-platypus",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model lgaalves/gpt2_guanaco-dolly-platypus on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T17:11:56.219131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of lgaalves/gpt2_guanaco-dolly-platypus",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model lgaalves/gpt2_guanaco-dolly-platypus on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T17:11:56.219131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
28,
31,
176,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of lgaalves/gpt2_guanaco-dolly-platypus## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model lgaalves/gpt2_guanaco-dolly-platypus on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T17:11:56.219131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
738f46a0072a5cc87d6b776cb6d856000fdcde1b
|
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
eValuation/data_xpert
|
[
"region:us"
] |
2023-08-31T22:26:49+00:00
|
{}
|
2023-08-31T22:28:30+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Dataset Name
## Dataset Description
- Homepage:
- Repository:
- Paper:
- Leaderboard:
- Point of Contact:
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Dataset Name",
"## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:",
"### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name",
"## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:",
"### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
8,
24,
32,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Dataset Name## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
85f44938d951590794ac4f23a53d57b8a0f64d8c
|
# Dataset Card for Evaluation run of Open-Orca/OpenOrca-Platypus2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Open-Orca/OpenOrca-Platypus2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Open-Orca/OpenOrca-Platypus2-13B](https://huggingface.co/Open-Orca/OpenOrca-Platypus2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Open-Orca__OpenOrca-Platypus2-13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T19:55:50.660601](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__OpenOrca-Platypus2-13B/blob/main/results_2023-09-22T19-55-50.660601.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.006291946308724832,
"em_stderr": 0.0008097697705635448,
"f1": 0.09632445469798691,
"f1_stderr": 0.0019431536363283344,
"acc": 0.42632540137974945,
"acc_stderr": 0.009926418007126542
},
"harness|drop|3": {
"em": 0.006291946308724832,
"em_stderr": 0.0008097697705635448,
"f1": 0.09632445469798691,
"f1_stderr": 0.0019431536363283344
},
"harness|gsm8k|5": {
"acc": 0.09021986353297953,
"acc_stderr": 0.00789153710844993
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803152
}
}
```
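As a convenience, the aggregated numbers above can also be loaded directly from the "results" configuration (the config and split names below are taken from this card's own metadata):

```python
from datasets import load_dataset

# "results" exposes a "latest" split plus one split per run timestamp.
results = load_dataset(
    "open-llm-leaderboard/details_Open-Orca__OpenOrca-Platypus2-13B",
    "results",
    split="latest",
)
```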
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Open-Orca__OpenOrca-Platypus2-13B
|
[
"region:us"
] |
2023-08-31T22:53:53+00:00
|
{"pretty_name": "Evaluation run of Open-Orca/OpenOrca-Platypus2-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Open-Orca/OpenOrca-Platypus2-13B](https://huggingface.co/Open-Orca/OpenOrca-Platypus2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Open-Orca__OpenOrca-Platypus2-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T19:55:50.660601](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__OpenOrca-Platypus2-13B/blob/main/results_2023-09-22T19-55-50.660601.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006291946308724832,\n \"em_stderr\": 0.0008097697705635448,\n \"f1\": 0.09632445469798691,\n \"f1_stderr\": 0.0019431536363283344,\n \"acc\": 0.42632540137974945,\n \"acc_stderr\": 0.009926418007126542\n },\n \"harness|drop|3\": {\n \"em\": 0.006291946308724832,\n \"em_stderr\": 0.0008097697705635448,\n \"f1\": 0.09632445469798691,\n \"f1_stderr\": 0.0019431536363283344\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09021986353297953,\n \"acc_stderr\": 0.00789153710844993\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803152\n }\n}\n```", "repo_url": "https://huggingface.co/Open-Orca/OpenOrca-Platypus2-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|arc:challenge|25_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T19_55_50.660601", "path": ["**/details_harness|drop|3_2023-09-22T19-55-50.660601.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T19-55-50.660601.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T19_55_50.660601", "path": ["**/details_harness|gsm8k|5_2023-09-22T19-55-50.660601.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T19-55-50.660601.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hellaswag|10_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T23:53:28.484029.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T23:53:28.484029.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T23:53:28.484029.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T23:53:28.484029.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T23:53:28.484029.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T19_55_50.660601", "path": ["**/details_harness|winogrande|5_2023-09-22T19-55-50.660601.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T19-55-50.660601.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_31T23_53_28.484029", "path": ["results_2023-08-31T23:53:28.484029.parquet"]}, {"split": "2023_09_22T19_55_50.660601", "path": ["results_2023-09-22T19-55-50.660601.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T19-55-50.660601.parquet"]}]}]}
|
2023-09-22T18:56:04+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Open-Orca/OpenOrca-Platypus2-13B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Open-Orca/OpenOrca-Platypus2-13B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-09-22T19:55:50.660601 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Open-Orca/OpenOrca-Platypus2-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Open-Orca/OpenOrca-Platypus2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T19:55:50.660601(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Open-Orca/OpenOrca-Platypus2-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Open-Orca/OpenOrca-Platypus2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T19:55:50.660601(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Open-Orca/OpenOrca-Platypus2-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Open-Orca/OpenOrca-Platypus2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T19:55:50.660601(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
e2d81d919ef9c07ea64510ea3aef8e007add8599
|
# MADLAD-400
## Dataset and Introduction
[MADLAD-400 (*Multilingual Audited Dataset: Low-resource And Document-level*)](https://arxiv.org/abs/2309.04662) is
a document-level multilingual dataset based on Common Crawl, covering 419
languages in total. This uses all snapshots of CommonCrawl available as of August
1, 2022. The primary advantage of this dataset over similar datasets is that it
is more multilingual (419 languages), it is audited and more highly filtered,
and it is document-level. The main disadvantage is also its strength -- being
more filtered, it may lack the recall needed for some applications.
There are two versions released: the **noisy** dataset, which has no filtering
except document-level LangID, and the **clean** dataset, which has a variety of
filters applied, though it naturally has a fair amount of noise itself. Each
dataset is released in a document-level form that has been deduplicated.
## Loading
You can load both the clean and noisy versions of any language by specifying its LangID:
~~~
from datasets import load_dataset

madlad_abt = load_dataset("allenai/madlad-400", "abt")
~~~
A list of languages can also be supplied with a keyword argument:
~~~
madlad_multilang = load_dataset("allenai/madlad-400", languages=["abt", "ace"])
~~~
Additionally, you can load the noisy and clean subsets separately with the split keyword argument:
~~~
madlad_multilang_clean = load_dataset("allenai/madlad-400", languages=["abt", "ace"], split="clean")
~~~
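For a corpus this size it is often impractical to materialize a split up front. As a sketch, assuming the standard `datasets` streaming interface works with this loader and that each record exposes a `text` field, you can iterate lazily instead:

~~~
from datasets import load_dataset

madlad_abt_stream = load_dataset("allenai/madlad-400", "abt",
                                 split="clean", streaming=True)
for doc in madlad_abt_stream:
    print(doc["text"][:200])  # assumed field name; check doc.keys() to confirm
    break
~~~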
## LangID model and Crawl
Following [Language Id In the Wild](https://arxiv.org/pdf/2010.14571.pdf), we
trained a Semi-Supervised LangId model (SSLID) on 500 languages. The training
data is as described in that paper, with the differences that 1) training data
is sampled to a temperature of `T=3` to reduce over-triggering on low-resource
languages; and 2) the data is supplemented with web-crawled data from the same
paper (that has already been through the various filters described therein) in
the hopes that it will increase robustness to web-domain text.
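For intuition, sampling to a temperature of `T=3` flattens the per-language distribution, upweighting low-resource languages relative to their raw counts. A minimal sketch (the exact scheme beyond the temperature value is an assumption):

```python
def temperature_sampling_weights(counts, T=3.0):
    # p_i is proportional to n_i^(1/T): T=1 reproduces corpus proportions,
    # larger T moves toward uniform, boosting low-resource languages.
    weights = [c ** (1.0 / T) for c in counts]
    total = sum(weights)
    return [w / total for w in weights]

# One high-resource and one low-resource language:
print(temperature_sampling_weights([1_000_000, 1_000]))  # ~[0.91, 0.09]
```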
## Filtering
Before separating the raw CommonCrawl corpus by LangID, the following
filtering steps were applied, similar to Raffel et al. (2020); a sketch of the
deduplication step follows the list:
- Discarded any page with fewer than 5 sentences and only retained lines that
contained at least 3 words.
- Removed any line with the word Javascript.
- Removed any page where the phrase “lorem ipsum” appeared.
- Removed any pages containing the phrases "terms of use", "privacy policy",
"cookie policy", "uses cookies", "use of cookies", "use cookies"
- Removed any pages that contained a curly bracket.
- To deduplicate the data set, discarded all but one of any three-sentence span occurring more than once in the data set.
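A minimal sketch of the three-sentence-span deduplication, under the assumption that documents are already split into sentences (`dedup_three_sentence_spans` is a hypothetical helper approximating the rule; the production pipeline runs distributed over the full corpus):

```python
def dedup_three_sentence_spans(docs):
    # docs: list of documents, each a list of sentences.
    seen = set()
    deduped = []
    for sentences in docs:
        kept = []
        for i, sent in enumerate(sentences):
            span = tuple(sentences[i:i + 3])
            if len(span) == 3:
                if span in seen:
                    continue  # drop repeats of a span seen elsewhere in the corpus
                seen.add(span)
            kept.append(sent)
        deduped.append(kept)
    return deduped
```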
The `noisy` subset of the data was filtered only by document-level LangID, which
was taken to be the majority sentence-level LangID prediction. The `clean`
subset removed all documents with a `pct_questionable` score greater than
20%. It furthermore removed any document with under 5 sentences.
The `pct_questionable` score is simply the percentage of sentences in the input
document that were "questionable". A sentence was considered questionable if any
of the following were true (a sketch of the full score follows the list):
* **LangID Consistency:** the sentence-level LangID does not match the
document-level LangID
* **List Case:** The sentence has at least 12 tokens, and over 50% percent of
the tokens began in a capital letter.
* **Length:** The sentence has under 20 characters or over 500 characters
(note: this is a bad heuristic for ideographic languages)
* **Danger Chars:** Over 20% of the characters in the sentence match
`[0-9{}+/()>]`
* **Cursedness:** The sentence matches a cursed regex (see below)
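Putting the criteria together, a hedged sketch of the score (the heuristics are paraphrased from the list above, and `is_cursed` is the function defined in the next section):

```python
import re

DANGER_CHARS = re.compile(r"[0-9{}+/()>]")

def pct_questionable(sentences, sentence_langids, doc_langid, is_cursed):
    # Percentage of sentences flagged questionable (illustrative only).
    def questionable(sent, langid):
        if langid != doc_langid:  # LangID consistency
            return True
        tokens = sent.split()
        if len(tokens) >= 12 and sum(t[:1].isupper() for t in tokens) > 0.5 * len(tokens):
            return True  # list case
        if len(sent) < 20 or len(sent) > 500:
            return True  # length (a poor heuristic for ideographic scripts)
        if len(DANGER_CHARS.findall(sent)) > 0.2 * len(sent):
            return True  # danger chars
        return is_cursed(sent)  # cursed regexes
    flags = [questionable(s, l) for s, l in zip(sentences, sentence_langids)]
    return 100.0 * sum(flags) / max(len(flags), 1)
```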
### Cursed Substrings
Based on the initial round of data audits, the authors created a heuristic list of
substrings and regexes accounting for a large amount of questionable content.
Keep in mind that these all are fed into the `pct_questionable` score -- a
sentence is only excluded from the `clean` dataset if over 20% of the sentences
in that document are flagged as questionable.
Notes about cursed substrings:
* low quality sentences ending in the pipe character were very common. Before
you ask, this was not Devanagari-script text using a Danda.
* The last few regexes are meant to match `A N T S P E A K`, `List Case`, and
weirdly regular text (for instance, lists of shipping labels or country
codes)
```
# this implementation is for demonstration and is pretty inefficient;
# to speed it up, use string inclusion (`in`) instead of regex for all but the
# last four, and for those use a compiled regex.
import re

def is_cursed(s):
    return any(re.search(curse, s) for curse in CURSED_SUBSTRINGS)
CURSED_SUBSTRINGS = [" №", "���", "\\|\\s*$", " nr\\.$", "aute irure dolor ", " sunt in culpa qui ", "orem ipsum ", " quis nostrud ", " adipisicing ", " dolore eu ", " cupidatat ", "autem vel eum", "wisi enim ad", " sex ", " porn ", "黄色电影", "mp3", "ownload", "Vol\\.", " Ep\\.", "Episode", " г\\.\\s*$", " кг\\.\\s*$", " шт\\.", "Develop", "Facebook", " crusher ", " xxx ", " ... ... ... ... ... ... ... ... ...", " .... .... .... .... .... .... .... .... ....", " [^ ] [^ ] [^ ] [^ ] [^ ] [^ ] [^ ] [^ ] [^ ]", ", ..,,? ..,,? ..,,? ..,,?"]
```
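For example, the trailing-pipe regex together with the `mp3` and `ownload` substrings fires on the first line below, while an ordinary sentence passes:

```python
print(is_cursed("download mp3 list |"))  # True
print(is_cursed("An ordinary sentence about the weather."))  # False
```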
### Virama Correction
Many languages using Brahmic Abugida (South and Southeast Asian scripts like
Devanagari, Khmer, etc.) use some variant on the virama character. For whatever
reason, it was found that this character was often messed up in the common crawl
snapshots used. Therefore, for the languages `bn my pa gu or ta te kn ml
si th tl mn lo bo km hi mr ne gom as jv dv bho dz hne ks_Deva mag mni shn yue zh
ja kjg mnw ksw rki mtr mwr xnr`, a special correction step was done.
For these languages, the authors took the list of all virama characters and removed all
unnecessary spaces between each instance of a virama character and the next
character with a regex.
```
import regex

def fix_viramas(x, virama_chars):  # virama_chars: all virama code points, concatenated
    # Remove the spurious spaces around each virama so conjuncts are restored.
    return regex.sub(r' ([%s]) ' % virama_chars, r'\1', x)
```
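As a hedged illustration, with the Devanagari virama (U+094D) standing in for the full `_VIRAMA_CHARS` set:

```python
print(fix_viramas("\u0915 \u094d \u092f\u093e", "\u094d"))  # 'क ् या' -> 'क्या'
```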
### Myanmar Font Compatibility
Prior to 2019, the most popular font for Burmese websites was the Zawgyi font.
The authors used [Myanmar Tools](https://github.com/google/myanmar-tools) to convert Zawgyi-encoded text to standard Unicode.
### Whitespace-free Scripts

Several scripts, like the Chinese script, Tibetan script, and Thai, do not use
whitespace to separate characters. The languages with this property in this
dataset are `yue zh ja th lo kjg mnw my shn ksw rki km bo dz`.
Alas, the **Length** aspect of the `pct_questionable` score was calculated using
simplistic whitespace tokenization, and therefore rendered the whole
`pct_questionable` score invalid for those languages. Therefore, for these
languages, the "clean" data is identical to the "noisy" data (barring Chinese;
see below.)
### Special filters
Chinese had a particular issue with pornographic content. After manual inspection,
a list of strings likely to be present in pornographic content was developed. All
pages containing at least one of these strings were removed. This resulted in a 17%
reduction in the number of documents and a 56% reduction in file size.
```
pornsignals = "caoporn caoprom caopron caoporen caoponrn caoponav caopom caoorn 99re dy888 caopro hezyo re99 4438x zooskool xfplay 7tav xxoo xoxo 52av freexx 91chinese anquye cao97 538porm 87fuli 91pron 91porn 26uuu 4438x 182tv kk4444 777me ae86 91av 720lu yy6080 6080yy qqchub paa97 aiai777 yy4480 videossexo 91free 一级特黄大片 偷拍久久国产视频 日本毛片免费视频观看 久久免费热在线精品 高清毛片在线看 日本毛片高清免费视频 一级黄色录像影片 亚洲男人天堂 久久精品视频在线看 自拍区偷拍亚洲视频 亚洲人成视频在线播放 色姑娘综合站 丁香五月啪啪 在线视频成人社区 亚洲人成视频在线播放 久久国产自偷拍 一本道 大香蕉无码 香港经典三级 亚洲成在人线免费视频 天天色综合网 大香蕉伊人久草 欧美一级高清片 天天鲁夜夜啪视频在线 免费黄片视频在线观看 加比勒久久综合 久草热久草在线视频 韩国三级片大全在线观看 青青草在线视频 美国一级毛片 久草在线福利资源 啪啪啪视频在线观看免费 成人福利视频在线观看 婷婷我去也 老司机在线国产 久久成人视频 手机看片福利永久国产 高清国产偷拍在线 大香蕉在线影院 日本高清免费一本视频 男人的天堂东京热 影音先锋男人资源 五月婷婷开心中文字幕 亚洲香蕉视频在线播放 天天啪久久爱视频精品 超碰久久人人摸人人搞".split()
```
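The page-level filter is then a plain substring check, sketched here with a hypothetical helper name:

```python
def has_porn_signal(page_text, signals=pornsignals):
    # Drop the page if any signal string occurs anywhere in it.
    return any(signal in page_text for signal in signals)
```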
A few more notes on the language codes used, compared to common alternatives for
these languages:
* `fil` for Filipino/Tagalog, not `tl`
* `ak` for Twi/Akan, rather than `tw`. This includes Fante.
* The dataset unfortunately uses the macro code `chm` for Meadow Mari (instead of
  the correct `mhr`), and `mrj` for Hill Mari
* `no` for Norwegian Bokmål, whereas some resources use
`nb`
* `ps` for Pashto instead of `pbt` (Southern Pashto)
* `ms` for Standard Malay, not `zlm`
* `sq` for Albanian, without distinguishing dialects like
  Gheg (`aln`) and Tosk (`als`)
* `ber` as the code for Tamazight, after consultation with Tamazight
speakers opining that the dialect distinctions are not significant. Other
resources use the individual codes like `tzm` and `kab`.
* Macrocode `qu` for Quechua. In practice, this seems usually to be
a mix of the Ayacucho and Cusco dialects. Other resources, like NLLB, may
use the dialect code, e.g. `quy` for Ayacucho Chanka. The same is true for a
few other macro codes, like `ff` (Macro code for Fulfulde, whereas other
sources may use e.g. `fuv`.)
* Really, notes could be made about almost any code, from well-accepted
  conventions like `zh` for Mandarin to dialectal questions such as which variant
  of Hmong the `hmn` data really is. The points above are called out specifically
  because the authors are aware of other data sources floating around that use
  different conventions.
## Audit
Following [Quality at a Glance](https://arxiv.org/abs/2103.12028), the authors performed
an "audit" of every corpus in this dataset. Although the authors did not speak most
languages, they were able to give high-level comments on the general quality. They
looked at a sample of 20 documents of each language.
After an initial round of auditing, they devised a new set of filters and applied
them. They then re-did all audits.
### Overall notes from the audit
The decision was to **include languages that looked noisy, but omit any language
that was clearly majority noise, or only had 20 or fewer docs.** This is a low
bar -- twenty documents can be very little indeed, and some of the corpora released are quite noisy, but all of them should have at least the potential to
be used in some useful way. The motivation for not releasing nonsense or tiny
datasets is to not give a false sense of how multilingual this dataset actually
is ("Representation washing"), as recommended by **Quality at a Glance**.
A few overarching points:
* Many low-resource languages only had Bible text, or in some cases jw.org
data. These are marked in the rows below. Generally `ok bible` means that
100% of the audited sentences were Biblical, whereas if `bible` is simply
mentioned in the note, it was not the only source of data.
* Indian languages in the Latin script had a high concentration of
pornographic content.
### Renames and Merges as a result of the Audit
In several cases, it was clear from the audit that the corpora were not in the
languages that the LangID model claimed they were. This led to the following
renames:
* `dty` renamed to `zxx-xx-dtynoise`, i.e. a "language" of noise. This is mainly
  mis-rendered PDFs and may have some practical applications for decoding
  such text.
* `fan` renamed to `bum`
* `ss-SZ` renamed to `ss` -- this was just a result of the authors having
  inconsistent data labels.
* `cjk` merged into the `gil` dataset
* `bjj` merged into the `awa` dataset
## Canaries
Canaries are provided in a separate `canaries` folder. Canaries are organized into three directories: `monolingual` hosts canaries designed for the MADLAD-400 monolingual data, `multiway` for the multiway data, and `generic` for the generic canaries generated only from the model's vocabulary.
* Monolingual: Canaries here are organized by the language the canary was generated from. This corresponds exactly to the `translate_copy` setting in the paper, where the source and target language match.
* Multiway: Canaries here are organized in one of two fashions. `to_XX` indicates canaries organized by the target language (and where the source language could be any language). `XX-XX` indicates the canaries (interleaved_both and interleaved_mislabeled_both) designed for a specific pair of languages.
Within each subdirectory above, canaries are split into separate files named by the canary type. There is always only a single file for each canary type. The `generic` folder contains within it the four canary types.
Canaries can be mixed in with normal training data and then analyzed post-hoc after training.
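
A minimal sketch of navigating the canary layout described above (paths and naming are assumptions based on the description, not a published API):

```python
from pathlib import Path
from typing import Iterator, Optional

def canary_files(root: str, direction: str,
                 subdir: Optional[str] = None) -> Iterator[Path]:
    """Yield canary files, e.g. canary_files(root, "monolingual", "en")
    or canary_files(root, "multiway", "to_fr")."""
    base = Path(root) / "canaries" / direction
    if subdir is not None:
        base = base / subdir  # a language, "to_XX", or an "XX-XX" pair
    for path in sorted(base.glob("*")):  # one file per canary type
        if path.is_file():
            yield path
```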
## References
Raffel, Colin, et al. "Exploring the limits of transfer learning with a unified
text-to-text transformer." J. Mach. Learn. Res. 21.140 (2020): 1-67.
## Contact
Please reach out to {snehakudugunta, icaswell}꩜google.com. For questions about the canaries, reach out to [email protected]
## License
This data is released with the `CC-BY-4.0` license.
## Detailed notes from the audit
Here are the notes on all languages, along with the number of documents
found, and the final decision made with respect to including the language in
this dataset.
| Lang. | note | N | decision |
| --------------- | ------------------------ | ---------- | --------------- |
| en | ok | 1838712272 | keep |
| ru | ok | 402458746 | keep |
| es | good | 250906994 | keep |
| de | ok | 225111495 | keep |
| fr | ok | 218863911 | keep |
| it | ok | 126406256 | keep |
| pt | ok | 124207090 | keep |
| pl | ok | 90908786 | keep |
| nl | ok | 86594116 | keep |
| tr | ok | 56417359 | keep |
| vi | ok | 54988654 | keep |
| cs | ok | 38254671 | keep |
| id | ok | 37979244 | keep |
| ro | ok | 35397563 | keep |
| sv | ok. Also the last | 35153050 | keep |
: : language (suz) is "ok : : :
: : bible" : : :
| hu | ok | 29677075 | keep |
| uk | ok | 24968305 | keep |
| fa | idk ask a farsi speaker; | 23138888 | keep |
: : ALI: OK : : :
| ja | ok a little en mixed in | 21818123 | keep |
| el | ok | 20932239 | keep |
| fi | ok | 20433664 | keep |
| da | ok | 17865888 | keep |
| th | ok | 17439979 | keep |
| no | ok | 14864710 | keep |
| bg | ok | 12755329 | keep |
| ko | ok | 12653878 | keep |
| ar | good | 12411641 | keep |
| sk | ok | 11857945 | keep |
| ca | ok | 9477390 | keep |
| lt | ok | 8748025 | keep |
| iw | ok | 7194574 | keep |
| sl | ok | 6310419 | keep |
| et | ok | 5542933 | keep |
| lv | ok | 5007982 | keep |
| hi | ok some porn | 4512205 | keep |
| sq | good | 3622957 | keep |
| az | good | 3256331 | keep |
| hr | ok | 2841400 | keep |
| ta | ok | 2594191 | keep |
| ms | ok | 2337672 | keep |
| ml | ok | 2072605 | keep |
| sr | ok | 2010607 | keep |
| kk | ok | 1810963 | keep |
| te | ok a lot of weirdly low | 1682441 | keep |
: : quality looking content : : :
: : like commerce : : :
| mr | ok fix virama | 1673848 | keep |
| is | ok | 1560913 | keep |
| bs | good | 1362582 | keep |
| mk | ok | 1358293 | keep |
| gl | ok | 1253170 | keep |
| eu | ok | 1155671 | keep |
| bn | ok | 1138848 | keep |
| be | ok | 1092785 | keep |
| ka | ok | 936497 | keep |
| fil | ok more bible than | 901507 | keep |
: : expected for such a : : :
: : major language : : :
| mn | ok mongolian cyrillic | 879878 | keep |
| af | good | 868671 | keep |
| uz | ok some cyrillic noise | 669909 | keep |
| gu | ok | 659727 | keep |
| kn | ok | 657846 | keep |
| kaa | ok cyrillic | 586361 | keep |
| sw | ok | 537847 | keep |
| ur | ok | 467236 | keep |
| ne | ok | 453349 | keep |
| cy | ok; was terrible before | 430719 | keep |
: : filtering short docs : : :
| hy | ok | 397523 | keep |
| ky | ok | 367577 | keep |
| si | good | 349220 | keep |
| tt | good plus some | 346927 | keep |
: : nonunicode misrendered : : :
: : PDF : : :
| tg | good | 328194 | keep |
| la | ok some broken chars | 319178 | keep |
| so | good | 293218 | keep |
| ga | ok some en noise | 285999 | keep |
| km | ok | 285740 | keep |
| mt | ok | 265388 | keep |
| eo | ok; likely a lot of MT | 259971 | keep |
| ps | ok | 252888 | keep |
| rw | ok | 226466 | keep |
| ku | ok | 218850 | keep |
| lo | ok many entities in | 215982 | keep |
: : latin script : : :
| fy | ok plausible but i bet | 210025 | keep |
: : there is a lot of nl in : : :
: : there : : :
| ha | ok | 173485 | keep |
| my | filter noise and en fix | 172401 | keep |
: : virama : : :
| dv | good | 167179 | keep |
| pa | ok | 150588 | keep |
| ckb | ok | 148870 | keep |
| lb | ok | 145988 | keep |
| mg | ok some bible jw | 115387 | keep |
| ht | ok | 110443 | keep |
| ug | ok | 106549 | keep |
| am | good | 106301 | keep |
| or | ok | 100530 | keep |
| fo | good | 97754 | keep |
| gd | ok | 94275 | keep |
| ba | ok | 90318 | keep |
| tk | ok; a few weird docs | 82495 | keep |
| mi | ok | 79509 | keep |
| hmn | ok | 75213 | keep |
| grc | ok some bible | 70730 | keep |
| jv | ok | 69473 | keep |
| ceb | ok | 66164 | keep |
| sd | good | 65858 | keep |
| yi | ok | 64949 | keep |
| kaa-Latn | ok urls are .ru or .kz | 61169 | keep |
| sn | ok | 60196 | keep |
| co | ok; I suspect lots of | 55387 | keep |
: : MT : : :
| su | good | 54968 | keep |
| pap | ok | 54498 | keep |
| ig | ok | 54410 | keep |
| zu | good | 53809 | keep |
| xh | ok | 53672 | keep |
| sm | ok | 52614 | keep |
| ny | ok | 52244 | keep |
| yo | ok | 52067 | keep |
| cv | good | 47318 | keep |
| el-Latn | good; a lot of old | 46428 | keep |
: : content! : : :
| kl | ok | 46027 | keep |
| haw | ok scam tv products | 45670 | keep |
| gsw | wtf is happening here; | 42712 | keep |
: : keep with disclaimer; : : :
: : STILL BOILERPLATE : : :
| tet | good ; actually a lot of | 40367 | keep |
: : fun data! : : :
| st | ok | 40360 | keep |
| lus | ok | 36437 | keep |
| oc | ok | 36379 | keep |
| as | good | 33825 | keep |
| rm | ok | 33805 | keep |
| br | ok after shortfilter | 33219 | keep |
| sah | ok | 29169 | keep |
| hi-Latn | filter porn this is half | 26723 | keep |
: : porn : : :
| se | good | 23872 | keep |
| cnh | good, some local news! | 21556 | keep |
: : not sure if WL : : :
| om | ok | 18895 | keep |
| ce | ok | 14968 | keep |
| udm | ok | 13376 | keep |
| lg | ok lot of | 13030 | keep |
: : www.bukedde.co.ug in : : :
: : this : : :
| os | ok | 12623 | keep |
| nv | ok | 12578 | keep |
| kha | ok | 12070 | keep |
| ilo | ok some bible | 11754 | keep |
| ctd-Latn | ok; from some local | 11629 | keep |
: : news? : : :
| vec | very noisy has wiki from | 11108 | keep |
: : other langs and .it : : :
: : websites so not sure if : : :
: : vec : : :
| hil | ok some en boilerplate | 10564 | keep |
| tyv | ok fun stuff plus some | 9083 | keep |
: : russian noise i think : : :
| iba | ok jw data | 7638 | keep |
| ru-Latn | ok | 7523 | keep |
| kbd | ok many .ru | 7486 | keep |
| ti | ok; poor tigray | 7288 | keep |
| sa | ok | 7117 | keep |
| av | good | 6331 | keep |
| bo | needs some serious | 6226 | keep |
: : script filtering. but : : :
: : there is some ok data in : : :
: : there. : : :
| zza | good | 6019 | keep |
| ber-Latn | ok | 5612 | keep |
| otq | ok | 5554 | keep |
| te-Latn | great good text....but | 5305 | keep |
: : mostly pornographic : : :
| bua | ok | 5264 | keep |
| ts | good | 5198 | keep |
| cfm | ok mostly from | 4858 | keep |
: : chinland.co : : :
| tn | good | 4821 | keep |
| krc | ok | 4815 | keep |
| ak | good; much but not all | 4768 | keep |
: : bible : : :
| meo | ok mostly blogs | 4655 | keep |
| chm | ok; fyi watch out for | 4653 | keep |
: : yandex translationese : : :
| to | good ; news bible | 4612 | keep |
: : government : : :
| ee | good; mostly religious | 4536 | keep |
| nso | ok | 4422 | keep |
| ady | good | 4206 | keep |
| rom | bible | 4187 | keep |
| bho | mostly from anjoria.com. | 4121 | keep |
: : Looks like valid : : :
: : Bhojpuri. : : :
| ltg | ok mostly www.lakuga.lv | 4120 | keep |
| fj | ok | 3976 | keep |
| yua | ok | 3965 | keep |
| gn | ok some broken | 3858 | keep |
: : characters some bible : : :
| az-RU | good; a lot of JW | 3781 | keep |
| ln | ok bible jw | 3325 | keep |
| ada | good; bible; likely | 3095 | keep |
: : mixed with gaa : : :
| myv | maybe has .ru urls | 3095 | keep |
| bik | ok. keep in mind the bik | 3092 | keep |
: : vs bcl issue. : : :
| tlh | ok, but why tf are there | 3054 | keep |
: : websites in klingon? all : : :
: : MT ? : : :
| kbp | not sure if right script | 3036 | keep |
: : wiki says latin : : :
| war | ok but v sus. Pls filter | 2928 | keep |
: : out wikipedia : : :
| wa | ok lots of wiki stuff | 2772 | keep |
| bew | mostly blogs. idk if | 2677 | keep |
: : standard Indonesian or : : :
: : not : : :
| rcf | ok | 2630 | keep |
| ta-Latn | good text .... but | 2580 | keep |
: : pornographic : : :
| kac | ok | 2567 | keep |
| iu | filter script some is en | 2537 | keep |
: : rest is iu script : : :
| ay | good; mix of bible and | 2505 | keep |
: : other news sources : : :
| kum | ok | 2495 | keep |
| qu | ok | 2449 | keep |
| bgp | almost all ur-Latn. | 2427 | keep |
: : consider removing or : : :
: : renaming : : :
| hif | ok some en noise and | 2358 | keep |
: : religious : : :
| kw | ok short boilerplate | 2324 | keep |
: : bible wiki; ok some porn : : :
| nan-Latn-TW | ok | 2285 | keep |
| srn | ok bible + jw | 2281 | keep |
| tly-IR | deeply sus | 2239 | keep |
| sg | ok jw | 2106 | keep |
| gom | ok | 2102 | keep |
| ml-Latn | ok some short docs | 2071 | keep |
| kj | ok | 2062 | keep |
| ksd | ok bible | 2000 | keep |
| dz | ok; hidden parallel | 1899 | keep |
: : text; maybe actually bo; : : :
: : mainly buddhist : : :
| kv | ok a lil boilerplate | 1878 | keep |
: : vibes : : :
| msi | ok | 1870 | keep |
| ve | ok mostly bible jw | 1866 | keep |
| zap | ok JW. | 1803 | keep |
| zxx-xx-dtynoise | BEAUTIFUL NOISE rename | 1765 | keep |
: : but keep as beautiful : : :
: : example. (was called : : :
: : "dty") : : :
| meu | ok bible | 1728 | keep |
| iso | ok jw | 1721 | keep |
| ium | filter out zh | 1721 | keep |
| nhe | ok | 1714 | keep |
| tyz | ok bible but again i | 1707 | keep |
: : think some mixed : : :
: : dialects : : :
| hui | ok some bible | 1680 | keep |
| new | ok | 1634 | keep |
| mdf | ok some short docs | 1609 | keep |
| pag | bible | 1588 | keep |
| gv | filter short repetitive | 1586 | keep |
: : sentences; still same : : :
: : but keep : : :
| gag | has 1-2 cyrillic | 1572 | keep |
: : examples with small amts : : :
: : of arabic script noise : : :
| ngu | ok | 1534 | keep |
| quc | bible | 1526 | keep |
| mam | ok bible jw | 1513 | keep |
| min | ok mostly wiki and bible | 1474 | keep |
| ho | ok | 1466 | keep |
| pon | bible | 1462 | keep |
| mrj | ok | 1447 | keep |
| lu | ok jw | 1444 | keep |
| gom-Latn | ok very noisy ; some ok | 1432 | keep |
: : stuff ; release with : : :
: : disclaimer : : :
| alt | ok | 1422 | keep |
| nzi | ok | 1371 | keep |
| tzo | ok bible + jw | 1357 | keep |
| bci | ok bible | 1329 | keep |
| dtp | ok; mostly from | 1309 | keep |
: : www.newsabahtimes.com.my : : :
| abt | fine; bible | 1305 | keep |
| bbc | ok | 1274 | keep |
| pck | ok | 1255 | keep |
| mai | ok mild amounts of en | 1240 | keep |
: : noise : : :
| mps | ok bible | 1239 | keep |
| emp | ok bible | 1238 | keep |
| mgh | ok bible jw | 1222 | keep |
| tab | idk plausibly ok | 1202 | keep |
| crh | ok | 1184 | keep |
| tbz | good mostly bible but | 1126 | keep |
: : not all : : :
| ss | good mix of data ; | 1089 | keep |
: : renamed from "ss-SZ" : : :
| chk | ok bible | 1082 | keep |
| bru | ok; bible | 1072 | keep |
| nnb | ok | 1071 | keep |
| fon | ok mostly jw but not all | 1065 | keep |
| ppk | bible | 1063 | keep |
| tiv | ok jw | 1063 | keep |
| btx | ok probably | 1009 | keep |
| bg-Latn | ok | 991 | keep |
| mbt | ok bible | 969 | keep |
| ace | good; bible | 966 | keep |
| tvl | ok jw | 933 | keep |
| dov | ok bible + jw | 923 | keep |
| ach | good; bible | 915 | keep |
| xal | ok has .ru sites though | 913 | keep |
| cuk | ok bible | 899 | keep |
| kos | ok lds bible | 881 | keep |
| crs | ok | 873 | keep |
| wo | ok; mostly bible. | 871 | keep |
| bts | ok; mostly bible | 869 | keep |
| ubu | ok bible | 846 | keep |
| gym | ok bible | 820 | keep |
| ibb | ok bible and repeated @ | 818 | keep |
| ape | good; bible | 814 | keep |
| stq | ok i think ? | 809 | keep |
| ang | much noise but some good | 803 | keep |
: : Old English in there! : : :
| enq | ok bible | 793 | keep |
| tsg | much noise but some good | 789 | keep |
: : data too! : : :
| shn | mostly English | 788 | keep |
: : boilerplate. filter by : : :
: : latin text before : : :
: : releasing : : :
| kri | ok boilerplate noise | 786 | keep |
: : bible jw : : :
| kek | ok jw bible | 782 | keep |
| rmc | ok | 738 | keep |
| acf | good; bible | 730 | keep |
| syr | good; practitioners | 716 | keep |
: : should keep dialect in : : :
: : mind. : : :
| qub | bible | 705 | keep |
| bm | good | 702 | keep |
| tzh | ok jw | 702 | keep |
| jiv | ok bible | 696 | keep |
| kn-Latn | filter en noise of | 688 | keep |
: : karnatake govt websites : : :
| kjh | ok .ru domain | 672 | keep |
| yap | ok | 638 | keep |
| ban | ok bible | 637 | keep |
| tuc | ok bible | 635 | keep |
| tcy | good; mostly wikipedia; | 632 | keep |
: : likely some konkani : : :
: : mixed in : : :
| cab | ok jw | 629 | keep |
| cak | ok bible | 617 | keep |
| din | ok after SD filter | 611 | keep |
| arn | good; bible | 593 | keep |
| lrc | ok | 587 | keep |
| gil | empty; but merged in | 586 | keep |
: : data in "cjk" : : :
| cjk | this is all in gil | 586 | keep |
: : (Kiribati). merged into : : :
: : "gil" : : :
| rwo | bible | 572 | keep |
| hus | ok bible | 569 | keep |
| bum | ok bible; but wrong | 559 | keep |
: : language. Data is in : : :
: : Bulu, not Fang : : :
| mak | ok bible | 555 | keep |
| frp | fair amount from | 550 | keep |
: : wikipedia. : : :
| seh | ok jw | 545 | keep |
| twu | ok bible, but also i | 539 | keep |
: : think it's lots of mixed : : :
: : similar dialects : : :
| kmb | ok bible jw | 538 | keep |
| ksw | ok bible | 536 | keep |
| sja | ok bible | 527 | keep |
| amu | good; bible; crazy | 511 | keep |
: : diacritics : : :
| mad | remove mostly short text | 509 | keep |
| quh | bible | 501 | keep |
| dyu | ok bible | 483 | keep |
| toj | ok jw | 452 | keep |
| ch | ok; not sure about WL | 449 | keep |
| sus | hella sus jk ok bible | 437 | keep |
| nog | ok | 419 | keep |
| jam | ok bible | 416 | keep |
| gui | ok bible | 409 | keep |
| nia | ok | 408 | keep |
| mas | ok some amount of bible | 405 | keep |
| bzj | ok bible | 404 | keep |
| mkn | ok bible | 402 | keep |
| lhu | ok bible | 377 | keep |
| ctu | ok bible | 366 | keep |
| kg | ok bible jw | 365 | keep |
| inb | ok bible | 343 | keep |
| guh | ok bible | 331 | keep |
| rn | bible | 323 | keep |
| bus | ok; bible; about 50bzc | 322 | keep |
| mfe | ok mostly bible maybe | 320 | keep |
: : some french creole short : : :
: : doc noise : : :
| sda | ok bible | 317 | keep |
| bi | good! fun! | 311 | keep |
| cr-Latn | noise and lorem ipsum. | 303 | keep |
: : But some ok Cree text. : : :
| gor | ok bible | 303 | keep |
| jac | ok bible | 303 | keep |
| chr | ok bible | 301 | keep |
| mh | ok jw lds | 296 | keep |
| mni | ok | 290 | keep |
| wal | ok bible + jw | 286 | keep |
| teo | ok bible | 274 | keep |
| gub | ok bible | 271 | keep |
| qvi | bible | 266 | keep |
| tdx | ok jw | 262 | keep |
| rki | ok | 251 | keep |
| djk | ok; bible+jw | 246 | keep |
| nr | ok | 246 | keep |
| zne | ok jw | 239 | keep |
| izz | ok bible | 237 | keep |
| noa | ok | 234 | keep |
| bqc | ok; bible | 228 | keep |
| srm | ok; bible + jw | 227 | keep |
| niq | ok | 226 | keep |
| bas | ok; has some fun blog | 216 | keep |
: : stuff! : : :
| dwr | ok; bible; mixed script | 215 | keep |
| guc | ok bible | 214 | keep |
| jvn | ok bible | 213 | keep |
| hvn | ok religious text | 200 | keep |
| sxn | ok bible ; also wild | 197 | keep |
: : diacritics : : :
| koi | ok | 196 | keep |
| alz | good; bible | 195 | keep |
| nyu | ok | 195 | keep |
| bn-Latn | ok | 191 | keep |
| suz | ok bible (see sv note) | 186 | keep |
| pau | ok | 185 | keep |
| nij | ok | 183 | keep |
| sat-Latn | good! all from local news | 183 | keep |
: : sources : : :
| gu-Latn | filter short en | 179 | keep |
: : boilerplate and : : :
: : repetitive sentences : : :
| msm | ok bible | 177 | keep |
| maz | ok bible jw | 170 | keep |
| qxr | bible | 153 | keep |
| shp | ok bible | 150 | keep |
| hne | ok | 146 | keep |
| ktu | ok bible jw | 144 | keep |
| laj | ok bible | 144 | keep |
| pis | bible | 139 | keep |
| mag | ok fix virama issue | 138 | keep |
| gbm | ok | 137 | keep |
| tzj | ok bible | 136 | keep |
| oj | ok | 135 | keep |
| ndc-ZW | ok | 132 | keep |
| tks | ok bible but again i | 127 | keep |
: : think some mixed : : :
: : dialects : : :
| gvl | filter short boilerplate | 126 | keep |
: : mostly bible : : :
| knj | ok bible | 126 | keep |
| awa | all bible in awadhi | 126 | keep |
: : (awa). Renamed from bjj : : :
| spp | ok bible | 123 | keep |
| mqy | bible remove short docs | 119 | keep |
| tca | ok bible + jw | 117 | keep |
| cce | ok jw | 116 | keep |
| skr | ok; some pnb mixed in | 107 | keep |
| kmz-Latn | ok some ar script noise | 106 | keep |
| dje | ok; mostly but not all | 100 | keep |
: : bible : : :
| gof | ok some bible | 97 | keep |
| agr | good; bible | 93 | keep |
| qvz | bible | 88 | keep |
| adh | good; bible | 87 | keep |
| quf | bible | 86 | keep |
| kjg | ok bible | 84 | keep |
| tsc | ok | 82 | keep |
| ber | ok great! | 79 | keep |
| ify | ok bible | 79 | keep |
| cbk | ok bible | 78 | keep |
| quy | bible | 78 | keep |
| ahk | good; bible; crazy | 77 | keep |
: : diacritics : : :
| cac | ok bible | 77 | keep |
| akb | good; bible | 71 | keep |
| nut | ok | 67 | keep |
| ffm | ok bible; mixed fulfulde | 65 | keep |
: : dialects; consider : : :
: : merging with ff : : :
| taj | ok bible | 65 | keep |
| ms-Arab | ok mostly utusanmelayu | 63 | keep |
: : website : : :
| brx | quite good! | 62 | keep |
| ann | good; all from wikimedia | 56 | keep |
: : incubator : : :
| qup | bible | 53 | keep |
| ms-Arab-BN | ok not sure if same as | 46 | keep |
: : ms-Arab : : :
| miq | ok | 45 | keep |
| msb | ok bible | 41 | keep |
| bim | good; bible | 40 | keep |
| raj | ok | 40 | keep |
| kwi | ok bible | 37 | keep |
| tll | ok jw | 37 | keep |
| trp | good ; lots of random | 36 | keep |
: : stuff : : :
| smt | ok bible but lots of | 34 | keep |
: : different bibles! : : :
| mrw | ok | 29 | keep |
| dln | ok bible | 28 | keep |
| qvc | bible | 27 | keep |
| doi | ok actually nice! | 26 | keep |
| ff | ok after shortfilter | 26 | keep |
| zh | very noisy | 19850947 | keep (filtered) |
| zh-Latn | poor quality | 602 | remove |
| rhg-Latn | remove | 10302 | remove |
| ja-Latn | remove maybe low quality | 7516 | remove |
: : short and repeated : : :
| pam | remove | 2773 | remove |
| za | revisit after | 1700 | remove |
: : shortfilter : : :
| ar-Latn | terrible, 0% correct, | 1520 | remove |
: : remove : : :
| mnw | remove en noise and | 1100 | remove |
: : boilerplate : : :
| fip | ok jw ; but wrong | 729 | remove |
: : language. mostly : : :
: : Mambwe-Lungu and Bemba, : : :
: : as well as Fipa (mgr+bem : : :
: : vs. fip) : : :
| el-CY | bad; not Cypriote | 537 | remove |
| luz | terrible; remove | 354 | remove |
| cni | ok; bible; lots of mixed | 261 | remove |
: : in content in : : :
: : not,cob,cpc,arl : : :
| apd-SD | terribly questionable; | 227 | remove |
: : probably remove : : :
| mey | mostly short and noisy | 127 | remove |
: : borderline : : :
| awa | OK; should be used with | 126 | remove |
: : caution and suspicion : : :
| mtq | remove short doc | 111 | remove |
: : repetitive : : :
| mel | remove noisy en | 103 | remove |
| mr-Latn | remove mostly porn and | 91 | remove |
: : short docs : : :
| srr | remove ; english | 91 | remove |
: : boilerplate : : :
| en-Cyrl | ok ... some fr-Cyrl too | 90 | remove |
: : and maybe others : : :
| en-Arab | remove | 79 | remove |
| syl | idk maybe ok ? | 61 | remove |
| jax | filter mostly | 58 | remove |
: : text.medjugorje.ws : : :
: : boilerplate : : :
| xmm | very noisy lots of dj | 58 | remove |
: : tiktok and peppa pig : : :
: : repeated : : :
| shu | quite questionable. prob | 53 | remove |
: : remove : : :
| ks | ok shorter docs | 51 | remove |
| gyn | remove boilerplate and | 45 | remove |
: : porn : : :
| aa | some pretty bad data but | 32 | remove |
: : also some good data. : : :
: : filter on "Woo" (case : : :
: : sensitive) : : :
| sjp | terrible; probably | 31 | remove |
: : remove; check again : : :
: : after short filter : : :
| abs | all short nonsense | 24 | remove |
: : remove : : :
| mui | remove short docs | 23 | remove |
| mdh | filter porn short text | 22 | remove |
: : and repetitive : : :
: : boilerplate : : :
| noe | ok | 22 | remove |
| sxu | revisit after shortfilter | 22 | remove |
| bhb-Gujr | bad. remove. all junk | 20 | remove |
: : gu. : : :
| yaq | remove | 20 | remove |
| prk | ok | 18 | remove |
| cgg | rather noisy but | 17 | remove |
: : potentially ok. not sure : : :
: : if WL or not : : :
| bto | bad; remove unless short | 16 | remove |
: : filter keeps enough : : :
| ayl | terrible | 13 | remove |
| pa-Arab | ok | 13 | remove |
| bmm | terrible. filter on | 11 | remove |
: : short and reevaluate : : :
| mfb | remove short boilerplate | 11 | remove |
| mtr | ok fix virama remove en | 11 | remove |
: : noise : : :
| pmy | remove | 11 | remove |
| skg | terrible; remove | 11 | remove |
| ymm | remove | 11 | remove |
| xnr | ok maybe fix virama | 9 | remove |
: : though it seems fine : : :
| kjb | ok bible | 8 | remove |
| azg | short noise; bible | 7 | remove |
| bgz | idk maybe ok but | 7 | remove |
: : probably bad : : :
| ctg | probably terrible | 7 | remove |
: : probably remove : : :
| nyo | ok | 7 | remove |
| mdy | ok bible | 6 | remove |
| syl-Latn | revisit or remove after | 6 | remove |
: : shortfilter : : :
| xog | ok bible and stories | 6 | remove |
| cyo | terrifying noise; remove | 4 | remove |
| kfy | filter virama issue | 4 | remove |
| nd | ok | 4 | remove |
| rwr | remove | 4 | remove |
| tuf | ok bible | 4 | remove |
| clu | ok bible | 3 | remove |
| ng | ok | 3 | remove |
| zyj | deeply bad data .. | 3 | remove |
: : revisit after : : :
: : shortfilter : : :
| rkt | ok | 2 | remove |
| bgc | super sketch. Remove | 1 | remove |
: : unless short doc filter : : :
: : leaves some. remove : : :
| dcc | remove | 1 | remove |
| ff-Adlm | good | 1 | remove |
| gju | remove short boilerplate | 1 | remove |
| max | remove short some ru | 1 | remove |
| mwr | filter short docs fix | 1 | remove |
: : virama : : :
| trw | sus; remove | 1 | remove |
| vkt | 1 doc remove | 1 | remove |
| gjk | empty remove | 0 | remove |
| bfy | very bad. remove unless | 0 | remove |
: : it looks better after : : :
: : filtering short docs; : : :
: : remove : : :
| nyn | ok | 0 | remove |
| sgj | remove | 0 | remove |
A few comments too long to fit in the table above:
* `alt`: WAIT THIS IS AMAZING IT IS ACTUALLY ALTAI! e.g. from urls like
https://altaicholmon.ru/2020/02/28/jarashty-la-jajaltany-jarkyndu-lekeri/
* `tly-IR`: They all look like boilerplate content, e.g., list of
keywords/search queries used to bump page ranking in search results. Not any
useful material for translation. Remove.
* `zap`: pls note that at least some Zapotec speakers tend to view it as one
language, not as a million dialects like ISO does. However, some are
certainly mutually unintelligible, complicating the matter.
* `zh-Latn`: The biggest problem is that several examples are not in Latin
Chinese (i.e., romanization in my understanding) but in English or mixed
English and Chinese. For those data in Latin Chinese, their quality seems to
be good.
* `zh`: Many examples are porn-related, particularly those very long
documents. Also, there are some examples of traditional Chinese.
## Final Dataset information
The number of documents, sentences, tokens, characters, and bytes for the noisy
and clean splits of the data. Note that the "toks" field below uses whitespace
for tokenization, so is not appropriate for non-whitespace-separating languages
like Chinese (see section above). Note that the English subset in this version
is missing 18% of the documents that were included in the published analysis of the dataset.
These documents will be incorporated in an update coming soon.
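
As a note on the "toks" columns, the whitespace tokenization they assume is simply (a sketch):

```python
def whitespace_toks(doc: str) -> int:
    """Count tokens by whitespace splitting, as in the "toks" columns.
    Under-counts for non-whitespace-separating languages like Chinese."""
    return len(doc.split())
```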
BCP-47 | docs (noisy) | docs (clean) | sents (noisy) | sents (clean) | toks (noisy) | toks (clean) | chars (noisy) | chars (clean) | bytes (clean) | bytes (noisy) |
----------------|:---------------|:---------------|:----------------|:----------------|:---------------|:---------------|:----------------|:----------------|:---------|:---------|
total* | 7.2B | 3.7B | 133.1B | 97.5B | 4.6T | 2.6T | 30.6T | 16.0T | 11.4 T | 6.3 T
en* | 3.0B | 1.5B | 71.1B | 45.4B | 2.0T | 1.3T | 12.3T | 7.6T | 2.6 T | 4.3 T |
ru | 823M | 402.5M | 823M | 12.4B | 416.5B | 240.9B | 3.1T | 1.8T | 832.9 G | 1.4 T |
es | 476.4M | 250.9M | 8.3B | 4.5B | 325.7B | 170.4B | 2.1T | 1.1T | 380.9 G | 747.5 G |
de | 478.6M | 225.1M | 11.5B | 6B | 299.5B | 139.6B | 2.2T | 1T | 370.6 G | 815.5 G |
fr | 384.2M | 218.9M | 7.9B | 5B | 307.1B | 165.2B | 2T | 1T | 370.4 G | 699.1 G |
it | 238.9M | 126.4M | 4.5B | 2.5B | 180.1B | 83.6B | 1.2T | 553.1B | 198.4 G | 429.6 G |
pt | 209.2M | 124.2M | 4B | 2.4B | 123.2B | 79.2B | 791.5B | 499.8B | 183.1 G | 289.6 G |
pl | 145.1M | 90.9M | 3.3B | 2.4B | 68.9B | 49.2B | 505B | 356.4B | 140.7 G | 202.5 G |
nl | 134.5M | 86.6M | 134.5M | 2.3B | 104.4B | 51.6B | 698.5B | 334.5B | 118.2 G | 247.5 G |
tr | 107M | 56.4M | 107M | 1.2B | 41.9B | 25B | 328.8B | 198.9B | 73.7 G | 123.9 G |
vi | 92.8M | 55M | 1.6B | 1B | 71.5B | 48.7B | 342B | 228.8B | 88.8 G | 133.9 G |
cs | 72.1M | 38.3M | 1.7B | 1B | 40.8B | 22.1B | 272.2B | 147.9B | 62.1 G | 112.7 G |
id | 120.9M | 38M | 2.2B | 747.5M | 60.4B | 20.2B | 443B | 148.3B | 48.5 G | 148.7 G |
ro | 60.8M | 35.4M | 60.8M | 746.4M | 37.1B | 22.9B | 244.1B | 148.2B | 55.5 G | 90.3 G |
sv | 65.2M | 35.2M | 65.2M | 1B | 62.1B | 23.9B | 422.6B | 153.7B | 57.0 G | 149.9 G |
hu | 47.6M | 29.7M | 1.3B | 806.3M | 29.8B | 17.8B | 223.6B | 134.9B | 53.5 G | 86.8 G |
uk | 46.6M | 25M | 1B | 599.9M | 21.6B | 12.8B | 164.2B | 95.2B | 45.1 G | 75.8 G |
fa | 58.1M | 23.1M | 920.6M | 493.5M | 40.6B | 18.4B | 220.4B | 96.7B | 43.4 G | 97.4 G |
ja | 23.3M | 21.8M | 326M | 321.6M | 10.9B | 10.9B | 133.3B | 132.2B | 98.7 G | 99.7 G |
el | 52.4M | 20.9M | 808M | 445.4M | 25B | 12B | 173.2B | 80.9B | 37.9 G | 80.8 G |
fi | 35.8M | 20.4M | 1B | 650.3M | 23.8B | 11.5B | 202.2B | 101.1B | 37.6 G | 74.1 G |
zh | 29.3M | 19.9M | 492.3M | 298.8M | 19.2B | 10B | 333B | 142.3B | 109.9 G | 191.8 G |
da | 38.5M | 17.9M | 1.1B | 508M | 37.7B | 13B | 252B | 83.1B | 29.4 G | 89.5 G |
th | 19M | 17.4M | 19M | 385.8M | 8.9B | 8.9B | 118.6B | 117.6B | 57.6 G | 58.2 G |
no | 34.7M | 14.9M | 34.7M | 498.7M | 46.6B | 11.8B | 305.6B | 74.8B | 27.3 G | 109.8 G |
bg | 27.2M | 12.8M | 599.4M | 360.3M | 14.4B | 8.8B | 95.6B | 57.8B | 26.0 G | 42.8 G |
ko | 19.7M | 12.7M | 628.6M | 471.8M | 13.3B | 9.3B | 65.9B | 43.8B | 34.2 G | 49.1 G |
ar | 67.6M | 12.4M | 876.6M | 182.6M | 39B | 7.1B | 243B | 43.2B | 20.9 G | 115.9 G |
sk | 23.2M | 11.9M | 487.9M | 300.6M | 11.3B | 6.7B | 77.8B | 45.7B | 18.8 G | 31.9 G |
ca | 17.9M | 9.5M | 258.6M | 153M | 8.9B | 5.6B | 56.5B | 34.6B | 12.6 G | 20.8 G |
lt | 15.3M | 8.7M | 374M | 256.9M | 7.5B | 5.3B | 58.6B | 41.3B | 15.7 G | 22.3 G |
he | 14.1M | 7.2M | 302.2M | 196.8M | 9.2B | 5.2B | 54.9B | 30.5B | 14.8 G | 26.3 G |
sl | 12M | 6.3M | 316M | 180M | 6.9B | 4.5B | 47.8B | 30.5B | 11.5 G | 18.0 G |
et | 8.8M | 5.5M | 223.8M | 176.3M | 5B | 3.6B | 40.1B | 28.7B | 10.7 G | 15.0 G |
lv | 8.4M | 5M | 186.1M | 138.5M | 4.8B | 3.2B | 36.7B | 23.9B | 9.1 G | 13.8 G |
hi | 9.9M | 4.5M | 254.4M | 152M | 7.4B | 3.8B | 39.9B | 20.1B | 9.9 G | 19.7 G |
sq | 5.5M | 3.6M | 5.5M | 56.1M | 2.7B | 2.1B | 17B | 12.7B | 4.8 G | 6.6 G |
az | 5.2M | 3.3M | 90.3M | 70.9M | 2.1B | 1.5B | 16.3B | 11.9B | 4.5 G | 6.3 G |
hr | 23M | 2.8M | 476.6M | 53M | 12.6B | 1.4B | 85.1B | 9.6B | 3.7 G | 33.5 G |
ta | 5.6M | 2.6M | 122.5M | 81.9M | 2.1B | 1.1B | 19.2B | 10.6B | 4.9 G | 8.8 G |
ms | 14.1M | 2.3M | 14.1M | 55.2M | 8B | 1.7B | 58.8B | 12.5B | 4.0 G | 20.4 G |
ml | 3.7M | 2.1M | 75M | 52M | 1B | 603.3M | 10.5B | 6.3B | 3.0 G | 5.1 G |
sr | 4.7M | 2M | 4.7M | 64M | 2.7B | 1.6B | 18.6B | 11B | 5.1 G | 8.7 G |
kk | 3.1M | 1.8M | 87.4M | 59.1M | 1.6B | 1B | 13.4B | 8.6B | 3.8 G | 5.8 G |
te | 2.5M | 1.7M | 59M | 46.4M | 900.2M | 618.5M | 7.4B | 5.1B | 2.6 G | 3.8 G |
mr | 2.9M | 1.7M | 2.9M | 50M | 1.2B | 776.9M | 8.7B | 5.5B | 2.8 G | 4.4 G |
is | 2.9M | 1.6M | 73.7M | 39.3M | 2.1B | 979.2M | 14.9B | 6.4B | 2.5 G | 5.9 G |
bs | 12.9M | 1.4M | 163.6M | 9M | 5.9B | 490.9M | 39.5B | 3.3B | 1.3 G | 15.6 G |
mk | 2.9M | 1.4M | 41.3M | 22.6M | 1.3B | 685.9M | 9.1B | 4.5B | 2.0 G | 4.0 G |
gl | 4.2M | 1.3M | 45.3M | 18.8M | 2.3B | 748.4M | 15.6B | 4.8B | 1.7 G | 5.5 G |
eu | 2.1M | 1.2M | 41.7M | 24.8M | 827.5M | 525.3M | 6.9B | 4.3B | 1.5 G | 2.4 G |
bn | 4.3M | 1.1M | 151.2M | 38.6M | 2.5B | 645.7M | 16.8B | 4.3B | 2.2 G | 8.7 G |
be | 2M | 1.1M | 48.8M | 31.3M | 981M | 632.9M | 7.2B | 4.6B | 2.2 G | 3.5 G |
ka | 3.1M | 936.5K | 53.7M | 26.6M | 1.2B | 460.8M | 10.3B | 3.8B | 1.9 G | 5.0 G |
fil | 4.2M | 901.5K | 67.4M | 19.2M | 2.2B | 741.7M | 14.6B | 4.7B | 1.5 G | 5.0 G |
mn | 2.2M | 879.9K | 43.3M | 24M | 1.1B | 487.5M | 7.9B | 3.5B | 1.6 G | 3.5 G |
af | 2.9M | 868.7K | 51.9M | 30M | 1.7B | 795M | 11.8B | 4.8B | 1.8 G | 4.2 G |
uz | 1.4M | 669.9K | 25.7M | 17.5M | 605.9M | 388.3M | 5.2B | 3.3B | 1.1 G | 1.9 G |
gu | 1.3M | 659.7K | 28.9M | 18.1M | 634.4M | 345.9M | 3.9B | 2.1B | 1.1 G | 2.0 G |
kn | 1.6M | 657.8K | 32.9M | 19.2M | 546.4M | 258.6M | 4.6B | 2.2B | 1.1 G | 2.3 G |
kaa | 1.1M | 586.4K | 19.8M | 13.3M | 455.9M | 269M | 3.8B | 2.2B | 990.2 M | 1.6 G |
sw | 1.3M | 537.8K | 1.3M | 9.5M | 660.7M | 345.8M | 4.6B | 2.4B | 826.1 M | 1.6 G |
ur | 967.2K | 467.2K | 29M | 18.4M | 1B | 562.5M | 5.2B | 2.7B | 1.2 G | 2.4 G |
ne | 876.4K | 453.3K | 876.4K | 20.4M | 585M | 345.3M | 3.9B | 2.2B | 1.1 G | 1.9 G |
cy | 4.9M | 430.7K | 68.3M | 7.4M | 3.6B | 275.6M | 26.4B | 1.7B | 609.5 M | 10.0 G |
hy | 2M | 397.5K | 31.1M | 9.9M | 1B | 190.9M | 8.1B | 1.5B | 678.9 M | 3.6 G |
ky | 751.1K | 367.6K | 14.3M | 9.6M | 303.4M | 181.6M | 2.5B | 1.4B | 665.1 M | 1.1 G |
si | 788K | 349.2K | 22.1M | 16M | 507.3M | 293.3M | 3.4B | 1.9B | 1023.6 M | 1.8 G |
tt | 2.1M | 346.9K | 60.2M | 8.6M | 1B | 135M | 12.1B | 1B | 494.1 M | 4.6 G |
tg | 789.2K | 328.2K | 789.2K | 7.4M | 363.8M | 208.8M | 2.6B | 1.4B | 635.7 M | 1.1 G |
la | 2.9M | 319.2K | 85.7M | 13.8M | 1.1B | 218.4M | 8.2B | 1.5B | 550.6 M | 2.9 G |
so | 729.2K | 293.2K | 729.2K | 3.1M | 294.8M | 146.3M | 2.1B | 992.4M | 350.8 M | 746.2 M |
ga | 5.3M | 286K | 31.7M | 6.9M | 4.2B | 229.3M | 30.6B | 1.4B | 500.7 M | 9.8 G |
km | 297.8K | 285.7K | 5M | 5M | 53M | 52.6M | 1.1B | 1.1B | 566.2 M | 570.0 M |
mt | 1.2M | 265.4K | 1.2M | 5.6M | 390.4M | 171.5M | 3.2B | 1.3B | 467.4 M | 1.1 G |
eo | 1.4M | 260K | 33.9M | 9.3M | 745.1M | 253.1M | 5.5B | 1.7B | 627.6 M | 1.9 G |
ps | 429.9K | 252.9K | 5.1M | 3.6M | 293.9M | 177.5M | 1.4B | 848.9M | 403.5 M | 682.9 M |
rw | 681.8K | 226.5K | 681.8K | 1.9M | 225M | 99.8M | 1.7B | 749.1M | 264.8 M | 702.4 M |
ku | 671.9K | 218.9K | 10.7M | 4.9M | 305.3M | 143.8M | 2.1B | 849.9M | 335.3 M | 791.9 M |
lo | 229.1K | 216K | 2.9M | 2.8M | 41.7M | 41.1M | 706.9M | 697.6M | 365.3 M | 370.8 M |
fy | 1.7M | 210K | 12.1M | 3.7M | 506.9M | 94M | 3.7B | 592.3M | 223.0 M | 1.2 G |
ha | 443.9K | 173.5K | 4.5M | 2.4M | 206.5M | 109.3M | 1.3B | 630.2M | 219.0 M | 478.1 M |
my | 176.5K | 172.4K | 176.5K | 10.1M | 96.6M | 96.3M | 1.3B | 1.3B | 648.8 M | 650.4 M |
dv | 264.4K | 167.2K | 4.3M | 3.5M | 92.8M | 64M | 877.3M | 603.1M | 238.3 M | 343.2 M |
pa | 368.2K | 150.6K | 368.2K | 6M | 306M | 152.8M | 1.6B | 797.1M | 414.1 M | 857.6 M |
ckb | 622.7K | 148.9K | 5.6M | 2.5M | 312.7M | 83.3M | 2.2B | 572.7M | 265.0 M | 1011.1 M |
lb | 7.6M | 146K | 47.1M | 3.4M | 7.5B | 85M | 58.4B | 575.5M | 218.4 M | 22.2 G |
mg | 295.2K | 115.4K | 4.5M | 2.6M | 189.4M | 75.5M | 1.3B | 548.5M | 179.0 M | 429.3 M |
ht | 425.6K | 110.4K | 6.7M | 2.6M | 163M | 84.3M | 994.5M | 461.5M | 168.2 M | 361.5 M |
ug | 227.1K | 106.5K | 4.5M | 3.1M | 122.9M | 62.7M | 998.5M | 504.6M | 233.1 M | 449.9 M |
am | 245.2K | 106.3K | 7.1M | 5.3M | 157M | 95.2M | 869.9M | 509M | 345.5 M | 539.4 M |
or | 139.6K | 100.5K | 139.6K | 3.1M | 66M | 47.3M | 437.2M | 309.5M | 160.3 M | 228.1 M |
fo | 382.9K | 97.8K | 3.9M | 1.8M | 136.5M | 48.9M | 923.3M | 314.9M | 122.0 M | 328.8 M |
gd | 206K | 94.3K | 3.7M | 2.4M | 127.6M | 84.5M | 812M | 526M | 173.4 M | 276.6 M |
ba | 372.4K | 90.3K | 9.3M | 2.6M | 101M | 42.1M | 766.5M | 320.7M | 154.8 M | 352.4 M |
tk | 180.2K | 82.5K | 180.2K | 1.8M | 65.4M | 43.3M | 575.2M | 369M | 131.3 M | 221.6 M |
mi | 711.9K | 79.5K | 5.9M | 1.9M | 262.5M | 73.5M | 1.6B | 371.9M | 120.2 M | 539.1 M |
hmn | 241.3K | 75.2K | 3.5M | 1.9M | 192.1M | 80.2M | 1.2B | 408.8M | 124.3 M | 366.0 M |
grc | 364.8K | 70.7K | 13.7M | 2.8M | 298.6M | 65.3M | 2B | 417.8M | 217.7 M | 1.0 G |
jv | 999.5K | 69.5K | 13M | 2M | 302.3M | 52.1M | 2.3B | 376.1M | 130.9 M | 797.8 M |
ceb | 617.5K | 66.2K | 6.7M | 1.6M | 225M | 58.2M | 1.5B | 357.7M | 116.2 M | 451.4 M |
sd | 115.6K | 65.9K | 115.6K | 2.4M | 112.6M | 77.8M | 561M | 380.4M | 182.3 M | 267.1 M |
yi | 160.6K | 64.9K | 3.3M | 1.9M | 129.1M | 53.9M | 838.4M | 352.6M | 146.0 M | 350.8 M |
kaa_Latn | 375.2K | 61.2K | 3.6M | 1.3M | 375.2K | 61.2K | 1.5M | 209.5K | 86.2 M | 264.6 M |
sn | 3.1M | 60.2K | 3.1M | 1.2M | 1.3B | 31.6M | 10.6B | 266M | 92.5 M | 3.2 G |
co | 546.7K | 55.4K | 6.1M | 1.3M | 172.6M | 43.6M | 1.1B | 265.5M | 98.8 M | 386.8 M |
su | 336.6K | 55K | 336.6K | 1.6M | 154M | 39.5M | 967.2M | 286.7M | 100.7 M | 308.5 M |
pap | 259.1K | 54.5K | 259.1K | 1.4M | 183.9M | 41.1M | 1.4B | 229.9M | 83.5 M | 451.4 M |
ig | 130.4K | 54.4K | 2.1M | 1.4M | 129.2M | 45.7M | 846.1M | 251.4M | 93.0 M | 178.9 M |
zu | 372.3K | 53.8K | 3.8M | 1.2M | 148.4M | 27.2M | 1.2B | 257.4M | 89.6 M | 374.7 M |
xh | 310.9K | 53.7K | 2.9M | 1.4M | 81.6M | 31.2M | 749.5M | 287.3M | 100.0 M | 319.1 M |
sm | 137.8K | 52.6K | 1.9M | 1.3M | 100.9M | 53.7M | 607.9M | 276.3M | 88.6 M | 184.5 M |
ny | 181.6K | 52.2K | 181.6K | 1.5M | 80.6M | 34.8M | 611.2M | 277.5M | 91.8 M | 209.8 M |
yo | 115K | 52.1K | 2M | 1.2M | 76.6M | 46.3M | 415.6M | 239M | 89.2 M | 157.8 M |
cv | 599.4K | 47.3K | 12M | 1.6M | 169.6M | 22.2M | 1B | 168.9M | 82.1 M | 413.6 M |
el_Latn | 497.3K | 46.4K | 11.3M | 1.7M | 497.3K | 46.4K | 2.3M | 162.8K | 196.8 M | 571.1 M |
kl | 85.9K | 46K | 2.1M | 1.5M | 32.3M | 22.3M | 403.9M | 279.1M | 84.2 M | 126.1 M |
haw | 310.4K | 45.7K | 7.1M | 1M | 141M | 43.3M | 892M | 214.2M | 69.9 M | 271.2 M |
gsw | 7.6M | 42.7K | 64.5M | 1M | 5B | 22.3M | 42.3B | 149.2M | 53.8 M | 13.5 G |
tet | 291K | 40.4K | 1.9M | 475.7K | 240.6M | 22.8M | 1.6B | 152.3M | 51.2 M | 455.4 M |
st | 96.8K | 40.4K | 96.8K | 1.1M | 65M | 39.8M | 381.5M | 226.9M | 74.0 M | 127.0 M |
lus | 91.5K | 36.4K | 1.4M | 863.5K | 53M | 31.3M | 298.3M | 167.3M | 60.1 M | 107.0 M |
oc | 2.4M | 36.4K | 2.4M | 1.6M | 887.6M | 26.7M | 6.7B | 177.6M | 58.7 M | 1.9 G |
as | 53.9K | 33.8K | 2.4M | 1.7M | 41.4M | 27.9M | 275.8M | 182.1M | 95.8 M | 146.1 M |
rm | 238.1K | 33.8K | 238.1K | 603.4K | 59.2M | 15.8M | 391M | 100.2M | 34.6 M | 133.1 M |
br | 705.4K | 33.2K | 7.8M | 731.7K | 646.8M | 21M | 3.7B | 125.4M | 46.2 M | 1.2 G |
sah | 1.3M | 29.2K | 1.3M | 1.2M | 283.7M | 17.6M | 2.2B | 148.2M | 68.3 M | 852.3 M |
hi_Latn | 1.2M | 26.7K | 22.6M | 1.2M | 1.2M | 26.7K | 5.3M | 98.9K | 53.5 M | 1.7 G |
se | 54.3K | 23.9K | 879.5K | 493.3K | 17.7M | 10M | 148.4M | 84.6M | 31.1 M | 56.6 M |
cnh | 44.4K | 21.6K | 688.6K | 406.9K | 21.6M | 12.5M | 110.8M | 63M | 22.1 M | 39.6 M |
om | 846.1K | 18.9K | 846.1K | 469.8K | 238M | 11.2M | 1.9B | 88.5M | 30.4 M | 881.5 M |
ce | 59.3K | 15K | 991.1K | 460.1K | 17.8M | 9.6M | 130.6M | 67.8M | 31.1 M | 60.2 M |
udm | 67.1K | 13.4K | 942.7K | 510.3K | 14M | 7.4M | 106M | 55.5M | 26.3 M | 49.2 M |
lg | 61.1K | 13K | 510.9K | 166.1K | 21.4M | 6.1M | 160.7M | 48M | 17.3 M | 56.7 M |
os | 172.1K | 12.6K | 172.1K | 359.3K | 27.1M | 6.9M | 233.5M | 50.1M | 23.1 M | 87.7 M |
nv | 17.1K | 12.6K | 17.1K | 86.5K | 3.1M | 1.1M | 24.8M | 9.1M | 2.0 M | 7.9 M |
kha | 37.8K | 12.1K | 235.5K | 75.2K | 15.8M | 6M | 88.6M | 30.2M | 9.8 M | 27.3 M |
ilo | 69.8K | 11.8K | 889.2K | 365.1K | 26.7M | 9M | 187.9M | 59.4M | 20.6 M | 64.0 M |
ctd_Latn | 23.3K | 11.6K | 575.6K | 382.2K | 23.3K | 11.6K | 90.7K | 41K | 21.5 M | 35.1 M |
vec | 1.1M | 11.1K | 10M | 209.7K | 284.7M | 7.8M | 1.8B | 43.8M | 17.7 M | 625.0 M |
hil | 126.8K | 10.6K | 1.1M | 379.7K | 43.9M | 9.2M | 293.5M | 57.2M | 18.5 M | 95.2 M |
tyv | 61.6K | 9.1K | 596.6K | 268.3K | 9.9M | 4.7M | 80.2M | 38.5M | 16.7 M | 36.6 M |
iba | 34K | 7.6K | 326.9K | 126.1K | 37.8M | 4.8M | 251.4M | 30.5M | 10.0 M | 61.3 M |
ru_Latn | 346.3K | 7.5K | 346.3K | 239.1K | 346.3K | 7.5K | 1.5M | 27.7K | 14.9 M | 452.3 M |
kbd | 154.7K | 7.5K | 1.4M | 257.2K | 31.9M | 4.4M | 321.4M | 36.8M | 16.8 M | 209.6 M |
ti | 20.8K | 7.3K | 20.8K | 481.3K | 18.2M | 8.8M | 95.4M | 44.6M | 30.9 M | 63.6 M |
sa | 154.3K | 7.1K | 154.3K | 1.1M | 70M | 9.9M | 512.5M | 88.8M | 44.9 M | 236.6 M |
av | 107.6K | 6.3K | 806.1K | 190.1K | 15.5M | 3.4M | 129M | 30.2M | 12.8 M | 56.0 M |
bo | 6.2K | 6.2K | 1.1M | 1.1M | 3.4M | 3.4M | 88.7M | 88.7M | 40.7 M | 40.7 M |
zza | 370.1K | 6K | 3.3M | 229.2K | 87.7M | 3.9M | 617.3M | 26.3M | 10.0 M | 234.1 M |
ber_Latn | 480.5K | 5.6K | 10.5M | 169.4K | 480.5K | 5.6K | 2.1M | 18.9K | 11.0 M | 945.3 M |
otq | 17.6K | 5.6K | 17.6K | 114.8K | 10.2M | 3.8M | 65M | 23.4M | 7.7 M | 22.8 M |
te_Latn | 236.6K | 5.3K | 4.4M | 269.1K | 236.6K | 5.3K | 1M | 19.3K | 11.4 M | 254.3 M |
bua | 9.8K | 5.3K | 252K | 144.6K | 4.7M | 2.7M | 38M | 21.7M | 10.0 M | 17.9 M |
ts | 34.7K | 5.2K | 34.7K | 248.6K | 39.6M | 6.5M | 377.2M | 38.8M | 12.2 M | 99.5 M |
cfm | 9.1K | 4.9K | 199.6K | 128.6K | 6.2M | 4M | 32.9M | 21.5M | 7.4 M | 11.6 M |
tn | 138.2K | 4.8K | 138.2K | 174.4K | 46M | 5.5M | 302.3M | 29.2M | 9.4 M | 99.0 M |
krc | 359.5K | 4.8K | 2.3M | 153.9K | 50.2M | 2.6M | 369.5M | 20.7M | 9.1 M | 139.9 M |
ak | 19.5K | 4.8K | 341.7K | 210.2K | 12.3M | 4.7M | 74.5M | 24.8M | 9.1 M | 24.7 M |
meo | 790.7K | 4.7K | 16.5M | 39K | 478M | 1.2M | 3B | 7.5M | 3.1 M | 1.2 G |
chm | 81.5K | 4.7K | 929.1K | 179.7K | 17.2M | 2.9M | 132.2M | 21.3M | 9.8 M | 53.5 M |
to | 14.3K | 4.6K | 14.3K | 149K | 10.3M | 5.7M | 58.2M | 29.9M | 9.6 M | 19.0 M |
ee | 14.1K | 4.5K | 353.6K | 246.7K | 9.7M | 6.2M | 67.9M | 32.8M | 11.8 M | 23.3 M |
nso | 376.2K | 4.4K | 376.2K | 188.4K | 419.2M | 5.3M | 2B | 28.2M | 9.1 M | 502.7 M |
ady | 74.9K | 4.2K | 446.8K | 96.9K | 8M | 1.6M | 67.9M | 14.8M | 6.4 M | 30.6 M |
rom | 22.9K | 4.2K | 22.9K | 76.1K | 8.9M | 2.6M | 59M | 15.9M | 5.8 M | 21.0 M |
bho | 13.6K | 4.1K | 306.2K | 118.5K | 7.1M | 2.7M | 37.6M | 13.4M | 7.4 M | 20.6 M |
ltg | 13.1K | 4.1K | 213.7K | 87.3K | 4M | 1.9M | 29.2M | 13.9M | 5.6 M | 11.7 M |
fj | 17K | 4K | 410K | 164.1K | 11.6M | 5.2M | 67.7M | 28M | 8.6 M | 22.5 M |
yua | 10.4K | 4K | 141.6K | 77.6K | 5.2M | 2.5M | 36.8M | 17.2M | 5.7 M | 12.4 M |
gn | 87.1K | 3.9K | 770.9K | 162.6K | 19.2M | 2.7M | 140.7M | 20.8M | 7.8 M | 52.1 M |
az_RU | 6.5K | 3.8K | 231.8K | 177.3K | 6.5K | 3.8K | 24K | 12.9K | 10.3 M | 15.1 M |
ln | 94.7K | 3.3K | 718.7K | 139K | 42.4M | 3.4M | 291.8M | 21.5M | 6.8 M | 85.3 M |
ada | 6.5K | 3.1K | 291.5K | 199.2K | 7.5M | 4.9M | 38.9M | 24.2M | 8.6 M | 13.9 M |
myv | 164.8K | 3.1K | 164.8K | 130K | 16M | 1.7M | 120.3M | 13.8M | 6.2 M | 49.5 M |
bik | 44.8K | 3.1K | 376.7K | 77K | 14.8M | 2.5M | 102.3M | 15.7M | 5.3 M | 34.0 M |
tlh | 516.9K | 3.1K | 516.9K | 46.9K | 221.3M | 1.1M | 1.4B | 7.8M | 2.7 M | 554.2 M |
kbp | 5.9K | 3K | 247.9K | 128.3K | 5.6M | 2.6M | 30.8M | 14.6M | 5.7 M | 12.4 M |
war | 1M | 2.9K | 114M | 96.2K | 612.1M | 2.4M | 3.5B | 16.1M | 3.7 M | 1.2 G |
wa | 70.6K | 2.8K | 1.5M | 127.2K | 35.2M | 3.6M | 198.8M | 20.4M | 7.2 M | 67.8 M |
bew | 311.1K | 2.7K | 10.4M | 58.4K | 212.4M | 1.3M | 1.4B | 8.5M | 3.1 M | 547.1 M |
rcf | 21.6K | 2.6K | 21.6K | 50.5K | 4.9M | 1.2M | 30.2M | 5.7M | 2.1 M | 11.4 M |
ta_Latn | 260.7K | 2.6K | 3.4M | 142.7K | 260.7K | 2.6K | 1.2M | 9.1K | 5.0 M | 215.4 M |
kac | 5.9K | 2.6K | 109.2K | 77.4K | 5M | 2.8M | 26.6M | 13.6M | 4.3 M | 8.0 M |
iu | 5.4K | 2.5K | 92.6K | 53.1K | 1.9M | 907.4K | 17.5M | 8.3M | 4.8 M | 9.9 M |
ay | 8.1K | 2.5K | 196.7K | 83.8K | 3.9M | 1.4M | 34.5M | 13.1M | 4.5 M | 12.7 M |
kum | 4.2K | 2.5K | 132.2K | 89.7K | 2.3M | 1.6M | 18.2M | 12.4M | 5.3 M | 8.0 M |
qu | 149.7K | 2.4K | 1M | 87K | 26.7M | 1.3M | 200.6M | 12.2M | 4.0 M | 68.3 M |
bgp | 355.7K | 2.4K | 5.6M | 43.3K | 186.1M | 1.8M | 1.1B | 9.8M | 3.1 M | 377.5 M |
hif | 702K | 2.4K | 7.9M | 124.7K | 1.2B | 3.2M | 9.1B | 19.1M | 5.9 M | 3.5 G |
kw | 176.9K | 2.3K | 1M | 51.6K | 53.1M | 1.3M | 327.8M | 7.7M | 2.8 M | 89.2 M |
nan_Latn_TW | 7.4K | 2.3K | 7.4K | 72.7K | 7.4K | 2.3K | 28.3K | 7.7K | 4.8 M | 15.4 M |
srn | 16.7K | 2.3K | 16.7K | 139.5K | 8M | 3.4M | 49.1M | 17M | 5.1 M | 15.6 M |
tly_IR | 406.3K | 2.2K | 406.3K | 18.2K | 406.3K | 2.2K | 1.6M | 8.6K | 580.4 K | 283.0 M |
sg | 4.2K | 2.1K | 154K | 117.9K | 4.6M | 3.3M | 22.6M | 15.5M | 4.6 M | 6.8 M |
gom | 4.6K | 2.1K | 178.3K | 108K | 2.7M | 1.4M | 19.8M | 10M | 5.0 M | 10.5 M |
ml_Latn | 260.8K | 2.1K | 3.5M | 77.3K | 260.8K | 2.1K | 1.1M | 7.2K | 3.5 M | 277.7 M |
kj | 112.2K | 2.1K | 881.8K | 22.6K | 46.9M | 877.3K | 339.6M | 6M | 2.1 M | 104.9 M |
ksd | 14.9K | 2K | 533K | 78.6K | 11.5M | 2.1M | 62.4M | 10M | 2.9 M | 20.0 M |
dz | 1.9K | 1.9K | 191.7K | 191.7K | 1.1M | 1.1M | 22.7M | 22.7M | 10.0 M | 10.0 M |
kv | 59.1K | 1.9K | 584.3K | 88.8K | 9.5M | 1.2M | 91.4M | 9M | 4.4 M | 41.0 M |
msi | 686.7K | 1.9K | 686.7K | 22.6K | 414.8M | 440.4K | 2.6B | 2.7M | 1.1 M | 1.0 G |
ve | 3.8K | 1.9K | 97.8K | 79.4K | 3.2M | 2.1M | 19M | 11.7M | 3.8 M | 6.2 M |
zap | 5.5K | 1.8K | 202.3K | 93.5K | 4.2M | 1.8M | 26.4M | 11.4M | 4.0 M | 9.6 M |
zxx_xx_dtynoise | 118.8K | 1.8K | 3.8M | 49.3K | 118.8K | 1.8K | 501K | 6.6K | 3.9 M | 367.0 M |
meu | 5.9K | 1.7K | 232.1K | 72.6K | 4.2M | 1.4M | 27.2M | 8.6M | 2.6 M | 9.1 M |
iso | 3.7K | 1.7K | 155.8K | 111.5K | 4.4M | 2.7M | 23M | 13.7M | 4.9 M | 8.1 M |
ium | 100.3K | 1.7K | 6.2M | 54.9K | 48.4M | 1.7M | 314M | 7.4M | 2.6 M | 124.0 M |
nhe | 3K | 1.7K | 3K | 57.7K | 1.9M | 1.2M | 15.6M | 9.8M | 2.7 M | 4.8 M |
tyz | 8K | 1.7K | 454.8K | 104.6K | 7.5M | 1.9M | 46.3M | 11.3M | 3.8 M | 16.0 M |
hui | 2K | 1.7K | 80.1K | 74.7K | 1.8M | 1.7M | 11.8M | 10.9M | 3.0 M | 3.3 M |
new | 6.6K | 1.6K | 6.6K | 85K | 3.2M | 1.4M | 21.2M | 8.8M | 4.4 M | 10.6 M |
mdf | 71K | 1.6K | 394.7K | 45.1K | 8.3M | 670.1K | 65.8M | 5.5M | 2.5 M | 26.7 M |
pag | 49.6K | 1.6K | 49.6K | 88.8K | 13.8M | 1.9M | 92.9M | 12M | 3.9 M | 29.2 M |
gv | 501.9K | 1.6K | 18.8M | 26.9K | 137.7M | 996.2K | 933.1M | 6.2M | 2.0 M | 318.6 M |
gag | 33.9K | 1.6K | 491K | 37K | 10.2M | 661K | 84.9M | 5.2M | 2.1 M | 32.6 M |
ngu | 3.8K | 1.5K | 3.8K | 87.1K | 2.7M | 1.5M | 21.4M | 11.8M | 3.6 M | 6.7 M |
quc | 4.4K | 1.5K | 89.2K | 41.2K | 2.8M | 1.1M | 16.6M | 6.4M | 2.2 M | 5.9 M |
mam | 23K | 1.5K | 446.3K | 52.9K | 9.8M | 1.2M | 70.4M | 7.2M | 2.6 M | 30.7 M |
min | 28.2K | 1.5K | 500.9K | 75.6K | 10.2M | 1.4M | 70.5M | 9.9M | 2.6 M | 21.1 M |
ho | 2K | 1.5K | 57K | 47.8K | 1.8M | 1.3M | 12.3M | 7.8M | 1.9 M | 3.1 M |
pon | 5.7K | 1.5K | 167.8K | 48.7K | 3M | 1.1M | 18.3M | 6.7M | 2.1 M | 6.1 M |
mrj | 97.1K | 1.4K | 97.1K | 60.3K | 14.5M | 1.1M | 100.6M | 7.6M | 3.6 M | 40.8 M |
lu | 10.6K | 1.4K | 316K | 112.1K | 7.8M | 2.3M | 54.2M | 15.4M | 4.8 M | 18.0 M |
gom_Latn | 231.1K | 1.4K | 4.1M | 77.9K | 231.1K | 1.4K | 1M | 5.1K | 3.6 M | 240.6 M |
alt | 2.6K | 1.4K | 110.1K | 65.9K | 1.8M | 1.1M | 14.3M | 8.7M | 3.8 M | 6.4 M |
nzi | 2.5K | 1.4K | 2.5K | 71.8K | 2.5M | 1.7M | 14.4M | 9.4M | 3.1 M | 4.8 M |
tzo | 2.8K | 1.4K | 100.4K | 75.7K | 2.5M | 1.7M | 15.9M | 10.6M | 3.2 M | 4.9 M |
bci | 7.4K | 1.3K | 124.8K | 87.1K | 5M | 1.9M | 32.8M | 9M | 3.1 M | 9.4 M |
dtp | 4.6K | 1.3K | 51.2K | 7.9K | 1.9M | 419.4K | 12.7M | 3M | 1013.9 K | 4.5 M |
abt | 1.6K | 1.3K | 122.7K | 110.3K | 1.5M | 1.3M | 9.6M | 8.2M | 2.2 M | 2.7 M |
bbc | 72.3K | 1.3K | 718.3K | 73.2K | 21.7M | 1.7M | 151.3M | 10.6M | 3.6 M | 47.9 M |
pck | 8.9K | 1.3K | 8.9K | 69.7K | 6.8M | 2.1M | 39.8M | 11.5M | 4.2 M | 14.2 M |
mai | 54.3K | 1.2K | 1M | 60.2K | 24.6M | 1.2M | 156M | 6.8M | 3.6 M | 67.1 M |
mps | 2.7K | 1.2K | 132.8K | 71.9K | 2.8M | 1.6M | 16M | 8.7M | 2.3 M | 4.8 M |
emp | 3.6K | 1.2K | 106.4K | 75.4K | 1.9M | 999.1K | 14.5M | 7.4M | 2.4 M | 4.9 M |
mgh | 5.5K | 1.2K | 151.8K | 61.2K | 2.8M | 1.1M | 24.1M | 8.2M | 2.8 M | 8.3 M |
tab | 7.8K | 1.2K | 226.4K | 26.8K | 4.3M | 538.9K | 33.7M | 4.4M | 1.9 M | 15.7 M |
crh | 5.1K | 1.2K | 170.9K | 61.8K | 2.4M | 943K | 18.8M | 7.5M | 3.4 M | 8.9 M |
tbz | 5.1K | 1.1K | 128.7K | 37.5K | 3.5M | 893.4K | 22M | 4.8M | 1.9 M | 10.2 M |
ss | 8.1K | 1.1K | 8.1K | 30.4K | 2.7M | 568.3K | 23.7M | 5.5M | 1.8 M | 7.4 M |
chk | 2.8K | 1.1K | 98.8K | 44K | 2M | 1M | 12M | 5.8M | 1.8 M | 4.0 M |
bru | 3K | 1.1K | 89.7K | 48.2K | 2.4M | 938.1K | 12.9M | 4.8M | 1.5 M | 4.5 M |
nnb | 4.9K | 1.1K | 4.9K | 70.2K | 3.2M | 1.2M | 27.7M | 9.1M | 3.3 M | 10.0 M |
fon | 5.3K | 1.1K | 222.9K | 67.3K | 6.9M | 1.8M | 34M | 8.3M | 3.1 M | 14.8 M |
ppk | 2.6K | 1.1K | 85.8K | 34.9K | 1.9M | 801.8K | 13.2M | 5.5M | 1.6 M | 4.3 M |
tiv | 3.8K | 1.1K | 3.8K | 80.7K | 3.7M | 2.1M | 20.4M | 10.2M | 3.2 M | 6.0 M |
btx | 3.1K | 1K | 81.7K | 43.9K | 2M | 907.5K | 13.1M | 5.9M | 2.0 M | 4.6 M |
bg_Latn | 200.4K | 991 | 2.8M | 25.5K | 200.4K | 991 | 927.1K | 3.7K | 1.7 M | 143.6 M |
mbt | 1.6K | 969 | 86K | 45.4K | 2.4M | 1.3M | 14.6M | 7.5M | 2.2 M | 5.1 M |
ace | 65.5K | 966 | 632.5K | 32.5K | 19.9M | 1.1M | 146.1M | 7.4M | 2.2 M | 42.3 M |
tvl | 2.3K | 933 | 72.9K | 53.6K | 2.5M | 1.7M | 12.6M | 8.1M | 2.4 M | 3.8 M |
dov | 3.5K | 923 | 129.8K | 56.7K | 2.6M | 967.5K | 20.7M | 8M | 2.6 M | 7.1 M |
ach | 2K | 915 | 63K | 40.1K | 1.6M | 890.9K | 9M | 4.7M | 1.6 M | 3.0 M |
xal | 71.8K | 913 | 498.5K | 30.8K | 8.5M | 449.8K | 64.7M | 3.2M | 1.5 M | 24.4 M |
cuk | 4.1K | 899 | 76.5K | 34.3K | 2M | 469.9K | 24.7M | 4.6M | 1.5 M | 6.1 M |
kos | 2.2K | 881 | 44.6K | 27.8K | 1.1M | 780.1K | 6.5M | 4.2M | 1.4 M | 2.2 M |
crs | 7.6K | 873 | 282.4K | 40.1K | 7.3M | 1.2M | 40.1M | 6.8M | 2.2 M | 13.2 M |
wo | 36.4K | 871 | 303.4K | 25.4K | 30.7M | 850.7K | 213.4M | 4.5M | 1.7 M | 59.9 M |
bts | 3.2K | 869 | 109.1K | 29.1K | 3.1M | 663.3K | 20.8M | 4.2M | 1.4 M | 6.2 M |
ubu | 2.2K | 846 | 113.5K | 47.5K | 2.3M | 996.4K | 15.9M | 6.7M | 1.9 M | 4.7 M |
gym | 1.5K | 820 | 73.7K | 49.6K | 1.6M | 1.1M | 10.3M | 6.9M | 2.0 M | 3.2 M |
ibb | 74.1K | 818 | 516.5K | 36.3K | 26.4M | 776.1K | 190.9M | 4.9M | 1.5 M | 56.0 M |
ape | 7K | 814 | 147K | 56.1K | 12.4M | 881.5K | 71M | 5.8M | 1.6 M | 18.8 M |
stq | 111.9K | 809 | 111.9K | 27.7K | 34.4M | 600.4K | 243.1M | 3.8M | 1.5 M | 82.5 M |
ang | 66.5K | 803 | 1.8M | 86.7K | 28.5M | 1.7M | 193M | 9.8M | 3.4 M | 67.1 M |
enq | 7.1K | 793 | 241.9K | 39.1K | 11M | 718.8K | 68.5M | 4.8M | 1.3 M | 18.8 M |
tsg | 353.8K | 789 | 353.8K | 17.9K | 158M | 588.9K | 1.1B | 3.8M | 1.0 M | 309.9 M |
shn | 889 | 788 | 46.4K | 46.2K | 383.8K | 378.5K | 5.7M | 5.7M | 2.6 M | 2.6 M |
kri | 39.1K | 786 | 271.2K | 38.8K | 12.6M | 995.2K | 86.4M | 5M | 1.6 M | 20.9 M |
kek | 3.2K | 782 | 70.4K | 38.4K | 1.8M | 709K | 13.6M | 4.4M | 1.4 M | 4.7 M |
rmc | 2.4K | 738 | 2.4K | 25.8K | 1.3M | 545.4K | 7.9M | 3.2M | 1.1 M | 2.9 M |
acf | 4.9K | 730 | 81.9K | 24.6K | 2.1M | 602.2K | 11.6M | 3M | 1.1 M | 4.7 M |
fip | 3.7K | 729 | 165.6K | 49K | 3.5M | 916.8K | 25.7M | 6.6M | 2.1 M | 8.6 M |
syr | 3.5K | 716 | 326.4K | 197.1K | 4.6M | 1.9M | 31.5M | 14M | 6.1 M | 13.9 M |
qub | 972 | 705 | 61K | 51.1K | 589.2K | 455.5K | 5.9M | 4.4M | 1.4 M | 1.8 M |
bm | 21.9K | 702 | 172.3K | 24.5K | 7.1M | 583.1K | 48.4M | 3M | 1.1 M | 14.4 M |
tzh | 1.7K | 702 | 41.7K | 33.9K | 1.5M | 929.6K | 9.3M | 5.6M | 1.6 M | 2.6 M |
jiv | 1.7K | 696 | 80.9K | 32K | 1.1M | 418.9K | 9.6M | 3.5M | 1.1 M | 3.3 M |
kn_Latn | 72.9K | 688 | 765.9K | 10.1K | 72.9K | 688 | 328.1K | 2.5K | 430.8 K | 61.4 M |
kjh | 1.5K | 672 | 42.8K | 28.7K | 566.1K | 379.2K | 4.5M | 3.1M | 1.3 M | 2.0 M |
yap | 1.9K | 638 | 37.6K | 19.5K | 1.3M | 661.4K | 6.9M | 3.3M | 1.0 M | 2.2 M |
ban | 8K | 637 | 150.9K | 16.3K | 5M | 499.7K | 35.4M | 3.6M | 1.1 M | 12.0 M |
tuc | 3.5K | 635 | 193.2K | 50.3K | 2.9M | 703K | 17.2M | 4.1M | 1.2 M | 5.7 M |
tcy | 10.7K | 632 | 338.7K | 37.1K | 5.5M | 432.6K | 41.6M | 3.3M | 1.7 M | 20.9 M |
cab | 1.2K | 629 | 50.4K | 37.5K | 1M | 690.9K | 7.5M | 5.1M | 1.6 M | 2.4 M |
cak | 1.2K | 617 | 70.4K | 32.6K | 1.3M | 730.1K | 7.6M | 4.2M | 1.3 M | 2.4 M |
din | 128.4K | 611 | 885.8K | 23.6K | 31.6M | 541.7K | 210M | 2.9M | 1.1 M | 64.3 M |
zh_Latn | 739.4K | 602 | 10.7M | 45.1K | 739.4K | 602 | 3.4M | 2.3K | 2.0 M | 969.9 M |
arn | 2.4K | 593 | 64.5K | 26.2K | 1.5M | 541.9K | 10.2M | 3.7M | 1.2 M | 3.7 M |
lrc | 42.4K | 587 | 351.9K | 9K | 17.3M | 248.9K | 85.3M | 1.4M | 646.9 K | 37.5 M |
rwo | 938 | 572 | 938 | 45.5K | 734.8K | 590.4K | 5.1M | 4.2M | 1.1 M | 1.4 M |
hus | 825 | 569 | 26.5K | 23.7K | 733.4K | 542.1K | 4.4M | 3.1M | 967.6 K | 1.3 M |
bum | 4.7K | 559 | 103.8K | 36.5K | 3M | 805.5K | 18.8M | 4M | 1.3 M | 6.1 M |
mak | 1K | 555 | 32.5K | 20.4K | 761K | 457.4K | 6.1M | 3.7M | 1.1 M | 2.0 M |
frp | 148K | 550 | 3.5M | 8.2K | 71.2M | 230.2K | 535.4M | 1.4M | 518.3 K | 129.7 M |
seh | 5.6K | 545 | 68.8K | 37.2K | 2M | 650.6K | 14.9M | 4.9M | 1.5 M | 4.4 M |
twu | 2.5K | 539 | 109.9K | 24.4K | 2.4M | 571.2K | 14.2M | 3.2M | 1.0 M | 4.8 M |
kmb | 1.3K | 538 | 60.4K | 36.9K | 1.4M | 810.8K | 8.4M | 4.6M | 1.4 M | 2.6 M |
ksw | 560 | 536 | 16.1K | 16K | 219.9K | 218.8K | 2.9M | 2.9M | 1.4 M | 1.4 M |
sja | 1.3K | 527 | 67.7K | 24.9K | 982.5K | 459.3K | 7.7M | 3.4M | 1.1 M | 2.6 M |
amu | 1.8K | 511 | 72K | 25.2K | 1.5M | 443.3K | 9.6M | 3.2M | 1.0 M | 3.4 M |
mad | 103.8K | 509 | 500.6K | 18.5K | 16.2M | 386.7K | 111.8M | 2.8M | 960.3 K | 34.2 M |
quh | 1K | 501 | 42K | 29.9K | 624.4K | 396.8K | 5.8M | 3.7M | 1.2 M | 1.8 M |
dyu | 1.2K | 483 | 55.8K | 19.7K | 1.2M | 421.8K | 5.7M | 2M | 665.5 K | 1.9 M |
toj | 736 | 452 | 736 | 26.1K | 691.2K | 540.2K | 4.3M | 3.3M | 1.0 M | 1.3 M |
ch | 12.9K | 449 | 147.5K | 16K | 8.9M | 393.9K | 63.5M | 2.5M | 906.8 K | 10.0 M |
sus | 664 | 437 | 664 | 15.2K | 648K | 402.8K | 3.7M | 2.1M | 674.0 K | 1.0 M |
nog | 970 | 419 | 970 | 11K | 330.3K | 200.4K | 2.6M | 1.6M | 714.0 K | 1.2 M |
jam | 12.7K | 416 | 68.5K | 15.8K | 3.5M | 378.4K | 25.8M | 1.7M | 609.5 K | 7.6 M |
gui | 1.1K | 409 | 62.7K | 24.8K | 915K | 314K | 6.5M | 2M | 619.3 K | 2.1 M |
nia | 2K | 408 | 2K | 25K | 1.7M | 476.5K | 11.3M | 3.1M | 1.0 M | 3.9 M |
mas | 15.2K | 405 | 216.8K | 17.6K | 6.2M | 390.1K | 42.1M | 3M | 927.5 K | 13.4 M |
bzj | 983 | 404 | 33.6K | 26.4K | 824.3K | 565K | 4.5M | 2.9M | 981.2 K | 1.4 M |
mkn | 956 | 402 | 33.1K | 25.4K | 584.2K | 456.9K | 3.4M | 2.6M | 734.8 K | 1.0 M |
lhu | 46K | 377 | 975K | 15.7K | 29.1M | 441.2K | 208.6M | 2.5M | 623.0 K | 38.8 M |
ctu | 690 | 366 | 35.5K | 20.6K | 646.7K | 352.8K | 3.6M | 2M | 614.9 K | 1.2 M |
kg | 4.7K | 365 | 85.5K | 21.7K | 2.5M | 406.7K | 16.6M | 2.6M | 905.4 K | 5.7 M |
inb | 387 | 343 | 17.3K | 17K | 202.8K | 197K | 2M | 1.9M | 535.2 K | 555.6 K |
guh | 1.9K | 331 | 104.9K | 28.4K | 1.5M | 328.4K | 11.2M | 3M | 789.5 K | 3.5 M |
rn | 8.2K | 323 | 8.2K | 11.1K | 4.5M | 179K | 33.2M | 1.3M | 449.9 K | 11.8 M |
bus | 467 | 322 | 21.4K | 12.1K | 418.4K | 219.2K | 2.1M | 1.1M | 428.8 K | 830.9 K |
mfe | 7.5K | 320 | 198.8K | 18.2K | 4.6M | 374.8K | 26.9M | 2.1M | 716.4 K | 10.1 M |
sda | 1.6K | 317 | 43.2K | 6.2K | 2.5M | 218.3K | 15.8M | 1.6M | 529.0 K | 4.7 M |
bi | 71.9K | 311 | 308.5K | 13.6K | 19.4M | 359.4K | 132.4M | 1.9M | 546.9 K | 42.6 M |
cr_Latn | 19K | 303 | 170K | 8.9K | 19K | 303 | 81.8K | 1K | 590.4 K | 15.0 M |
gor | 1.7K | 303 | 53.3K | 6.5K | 1.4M | 227.1K | 9.4M | 1.7M | 494.0 K | 3.1 M |
jac | 8.2K | 303 | 61.6K | 11.9K | 1.8M | 271K | 15.7M | 1.7M | 530.3 K | 7.3 M |
chr | 964 | 301 | 33.8K | 7.5K | 629.9K | 172.3K | 4.7M | 1M | 564.1 K | 2.1 M |
mh | 4.6K | 296 | 235.1K | 13K | 3.6M | 393.5K | 24.9M | 2.2M | 778.4 K | 8.4 M |
mni | 1.2K | 290 | 38.1K | 13.2K | 841.3K | 245.5K | 6.4M | 1.8M | 866.6 K | 3.0 M |
wal | 2.6K | 286 | 128K | 14K | 2M | 203.4K | 17M | 1.7M | 525.7 K | 5.1 M |
teo | 2.8K | 274 | 131.5K | 13.7K | 2.3M | 221.4K | 15.3M | 1.6M | 564.9 K | 5.3 M |
gub | 31.7K | 271 | 160.4K | 25K | 4.7M | 286.2K | 44.7M | 1.6M | 431.3 K | 23.1 M |
qvi | 1.2K | 266 | 48.4K | 19.3K | 720.4K | 248.9K | 6.5M | 2.3M | 641.2 K | 1.9 M |
tdx | 1.7K | 262 | 26.3K | 13.2K | 1M | 238.5K | 7M | 1.6M | 503.6 K | 2.1 M |
rki | 331 | 251 | 331 | 7.8K | 119.7K | 113.7K | 1.6M | 1.5M | 751.3 K | 781.8 K |
djk | 560 | 246 | 30.9K | 24.4K | 669.5K | 455.6K | 3.7M | 2.2M | 644.3 K | 1.0 M |
nr | 10.7K | 246 | 10.7K | 11.3K | 5.3M | 162.5K | 49M | 1.5M | 519.7 K | 17.8 M |
zne | 1.3K | 239 | 61.9K | 21.3K | 1.4M | 504.6K | 8.2M | 2.8M | 882.3 K | 2.8 M |
izz | 423 | 237 | 21.7K | 14.5K | 382.8K | 194.5K | 2.1M | 1.1M | 382.2 K | 789.9 K |
noa | 902 | 234 | 902 | 11.5K | 821.1K | 243.9K | 5.2M | 1.6M | 534.3 K | 1.7 M |
bqc | 275 | 228 | 9.8K | 8.2K | 193K | 151.7K | 997K | 788.4K | 317.0 K | 408.1 K |
srm | 847 | 227 | 847 | 17.3K | 1.2M | 445.3K | 6.3M | 2M | 613.4 K | 1.7 M |
niq | 26.7K | 226 | 26.7K | 4.2K | 9.9M | 103.4K | 72.1M | 716.2K | 239.1 K | 20.9 M |
bas | 4.2K | 216 | 105.2K | 14.9K | 4.3M | 362.8K | 25.7M | 1.7M | 600.7 K | 7.6 M |
dwr | 452 | 215 | 22.1K | 11.1K | 269.4K | 139.5K | 2.2M | 1.2M | 375.4 K | 747.6 K |
guc | 537 | 214 | 22.9K | 12.5K | 422.4K | 218.1K | 3.4M | 1.8M | 540.1 K | 1.1 M |
jvn | 1K | 213 | 36.2K | 7.8K | 790.5K | 185.6K | 5.3M | 1.2M | 357.2 K | 1.7 M |
hvn | 737 | 200 | 33.9K | 7K | 779.7K | 239.4K | 4.3M | 1.2M | 378.5 K | 1.4 M |
sxn | 587 | 197 | 587 | 9.9K | 494K | 220.6K | 3.4M | 1.5M | 507.1 K | 1.2 M |
koi | 20.7K | 196 | 153.9K | 5K | 2.2M | 89.9K | 17.1M | 664.5K | 323.0 K | 7.1 M |
alz | 2.2K | 195 | 59.3K | 12.2K | 1.3M | 246.9K | 7.9M | 1.4M | 488.1 K | 2.9 M |
nyu | 1.2K | 195 | 1.2K | 11K | 988.7K | 210.5K | 7.7M | 1.6M | 492.6 K | 2.2 M |
bn_Latn | 98.7K | 191 | 1.3M | 12K | 98.7K | 191 | 458K | 730 | 314.7 K | 81.0 M |
suz | 226 | 186 | 226 | 11.3K | 169.6K | 140.5K | 1M | 855.2K | 339.5 K | 429.6 K |
pau | 1.7K | 185 | 1.7K | 13.1K | 2M | 394.6K | 12.4M | 2M | 600.1 K | 3.2 M |
nij | 1K | 183 | 1K | 9.2K | 741.6K | 186.1K | 4.7M | 1.2M | 389.6 K | 1.6 M |
sat_Latn | 39K | 183 | 39K | 5.5K | 39K | 183 | 183.8K | 601 | 276.1 K | 39.2 M |
gu_Latn | 58.2K | 179 | 688.4K | 5.4K | 58.2K | 179 | 260.8K | 673 | 241.0 K | 47.9 M |
msm | 520 | 177 | 520 | 8.6K | 410.8K | 190.5K | 2.5M | 1.1M | 339.7 K | 789.8 K |
maz | 585 | 170 | 21.3K | 8.2K | 452.9K | 174K | 2.9M | 951.7K | 304.7 K | 971.4 K |
qxr | 2.6K | 153 | 40.8K | 6.4K | 761.5K | 75.4K | 6.6M | 724K | 186.4 K | 1.9 M |
shp | 874 | 150 | 22.4K | 3.7K | 534.1K | 96.8K | 3.8M | 710.4K | 216.9 K | 1.2 M |
hne | 3K | 146 | 118.4K | 4.3K | 2.3M | 139.3K | 12M | 697K | 379.3 K | 6.5 M |
ktu | 3.3K | 144 | 115.5K | 7.8K | 3.2M | 196.9K | 18.5M | 1.1M | 300.1 K | 5.4 M |
laj | 6.5K | 144 | 61K | 6.4K | 2.4M | 140.1K | 15.8M | 730.5K | 233.5 K | 4.6 M |
pis | 1.1K | 139 | 62K | 7.2K | 1.3M | 136.8K | 7.7M | 764K | 212.7 K | 2.2 M |
mag | 631 | 138 | 62.6K | 22.1K | 2.1M | 544.2K | 10.7M | 2.6M | 1.4 M | 5.4 M |
gbm | 2.5K | 137 | 50.8K | 3.8K | 1.7M | 99.7K | 9.1M | 499.6K | 282.4 K | 4.5 M |
tzj | 471 | 136 | 11.1K | 7.3K | 299.9K | 150.8K | 1.9M | 884.2K | 272.0 K | 663.9 K |
oj | 2.5K | 135 | 2.5K | 1.6K | 1.2M | 35.9K | 9.6M | 337.1K | 117.6 K | 3.4 M |
ndc_ZW | 2.2K | 132 | 2.2K | 8.7K | 2.2K | 132 | 9.1K | 523 | 343.1 K | 2.2 M |
tks | 63.7K | 127 | 63.7K | 6.8K | 17.1M | 41.5K | 88.9M | 260.8K | 39.5 K | 33.0 M |
awa | 5.8K | 126 | 100.1K | 8.4K | 2.2M | 98.7K | 11.1M | 475K | 226.6 K | 5.8 M |
gvl | 37.9K | 126 | 213K | 6.9K | 21.1M | 161.1K | 141M | 789.2K | 257.8 K | 31.7 M |
knj | 229 | 126 | 10.1K | 9.2K | 202.6K | 171.8K | 1.1M | 855K | 253.1 K | 345.4 K |
spp | 733 | 123 | 733 | 5.8K | 902.7K | 141.8K | 4.4M | 682.5K | 217.8 K | 1.4 M |
mqy | 69.3K | 119 | 309K | 2.5K | 12.1M | 88.6K | 78.9M | 506.5K | 170.4 K | 16.3 M |
tca | 410 | 117 | 20K | 7.3K | 283K | 121.5K | 2.3M | 786K | 226.2 K | 781.2 K |
cce | 847 | 116 | 23.2K | 11K | 539.3K | 227.2K | 3.3M | 1.3M | 393.8 K | 1.1 M |
skr | 3.8K | 107 | 279.3K | 17.1K | 6.2M | 324K | 32.2M | 1.7M | 768.5 K | 15.4 M |
kmz_Latn | 24K | 106 | 361K | 2.4K | 24K | 106 | 108.6K | 401 | 231.8 K | 16.7 M |
dje | 913 | 100 | 40.2K | 3.7K | 816.3K | 97.5K | 4.7M | 480.7K | 161.2 K | 1.5 M |
gof | 2.8K | 97 | 33.8K | 5.5K | 703K | 68.8K | 5.5M | 506K | 159.1 K | 1.7 M |
agr | 465 | 93 | 16.1K | 3.6K | 295.4K | 67.2K | 2.3M | 554.5K | 177.0 K | 760.1 K |
qvz | 534 | 88 | 6.8K | 3.5K | 145.5K | 50.5K | 1.2M | 438.3K | 124.2 K | 382.7 K |
adh | 2.6K | 87 | 107.2K | 1K | 2.4M | 42.1K | 14.5M | 254.9K | 84.6 K | 5.0 M |
quf | 522 | 86 | 8.4K | 5.2K | 155.7K | 61.8K | 1.5M | 609K | 173.7 K | 542.8 K |
kjg | 113 | 84 | 3K | 2.9K | 67.6K | 67K | 408.5K | 399K | 159.2 K | 167.7 K |
tsc | 12.6K | 82 | 12.6K | 4K | 3.5M | 93.1K | 23.4M | 521.3K | 161.9 K | 7.0 M |
ber | 2.7K | 79 | 12.6K | 1.2K | 1.1M | 46.4K | 6.4M | 265.9K | 141.5 K | 3.0 M |
ify | 611 | 79 | 19.8K | 2.8K | 422.7K | 56.2K | 2.6M | 334K | 109.5 K | 913.1 K |
cbk | 10.1K | 78 | 43.8K | 2K | 1.7M | 64.3K | 10.3M | 339.3K | 93.4 K | 3.4 M |
quy | 588 | 78 | 28.1K | 2.7K | 423.3K | 37.3K | 4.5M | 368.2K | 114.5 K | 1.2 M |
ahk | 244 | 77 | 6.2K | 4.1K | 264K | 124.8K | 1.3M | 715.5K | 182.8 K | 359.7 K |
cac | 212 | 77 | 3.4K | 1.8K | 125.7K | 54.1K | 978.7K | 319.8K | 95.8 K | 280.3 K |
akb | 1K | 71 | 21.3K | 408 | 870.9K | 54.5K | 5.2M | 337.8K | 93.7 K | 1.6 M |
nut | 29K | 67 | 29K | 1.5K | 4.8M | 39.8K | 23.5M | 184.1K | 36.4 K | 8.3 M |
ffm | 1.8K | 65 | 30.1K | 2K | 745.6K | 39.1K | 4.6M | 236.1K | 83.8 K | 1.8 M |
taj | 146 | 65 | 21.6K | 14.3K | 309.7K | 203K | 2.3M | 1.4M | 503.0 K | 872.7 K |
ms_Arab | 698 | 63 | 698 | 320 | 698 | 63 | 2.9K | 239 | 64.7 K | 1016.0 K |
brx | 322 | 62 | 5.3K | 2.4K | 144.2K | 41K | 1.1M | 304.4K | 146.6 K | 515.7 K |
ann | 464 | 56 | 5K | 1.6K | 116.4K | 35.9K | 760.9K | 215.1K | 74.9 K | 295.2 K |
qup | 169 | 53 | 4.3K | 2.5K | 77.5K | 31.3K | 763.8K | 297.8K | 74.7 K | 207.3 K |
ms_Arab_BN | 2.6K | 46 | 2.6K | 374 | 2.6K | 46 | 10.5K | 171 | 50.0 K | 5.1 M |
miq | 236 | 45 | 6.4K | 3.5K | 183.7K | 80.2K | 1.2M | 485.6K | 157.6 K | 384.1 K |
msb | 811 | 41 | 811 | 1K | 705.9K | 28.8K | 4.4M | 167.5K | 53.3 K | 1.7 M |
bim | 410 | 40 | 31.1K | 6.3K | 669.8K | 167.4K | 3.2M | 793.4K | 252.7 K | 1.1 M |
raj | 1.8K | 40 | 1.8K | 5.7K | 1.3M | 81.1K | 7.1M | 405K | 226.2 K | 3.9 M |
kwi | 382 | 37 | 16.9K | 2.2K | 253.8K | 23.4K | 1.8M | 172.8K | 47.6 K | 536.2 K |
tll | 200 | 37 | 200 | 2.7K | 304.2K | 62.2K | 2.2M | 409.8K | 132.3 K | 664.5 K |
trp | 12.8K | 36 | 12.8K | 1.7K | 4.1M | 39K | 29.9M | 257.3K | 87.5 K | 10.2 M |
smt | 1.4K | 34 | 1.4K | 703 | 1M | 36.5K | 6.8M | 245.4K | 87.9 K | 2.5 M |
mrw | 11.3K | 29 | 11.3K | 1K | 4.2M | 45.7K | 27.8M | 257.2K | 81.3 K | 8.8 M |
dln | 236 | 28 | 5.2K | 969 | 150.8K | 21.5K | 860.5K | 118.3K | 36.8 K | 280.3 K |
qvc | 3.4K | 27 | 14.6K | 2.2K | 495.7K | 25.7K | 5M | 233.7K | 65.3 K | 2.6 M |
doi | 1.7K | 26 | 21.8K | 975 | 568.7K | 25.5K | 3.2M | 135.3K | 66.7 K | 1.6 M |
ff | 13.6K | 26 | 150K | 5K | 3.4M | 46.5K | 22.8M | 277.6K | 78.8 K | 8.5 M |
## Citation Information
~~~
@misc{kudugunta2023madlad400,
title={MADLAD-400: A Multilingual And Document-Level Large Audited Dataset},
author={Sneha Kudugunta and Isaac Caswell and Biao Zhang and Xavier Garcia and Christopher A. Choquette-Choo and Katherine Lee and Derrick Xin and Aditya Kusupati and Romi Stella and Ankur Bapna and Orhan Firat},
year={2023},
eprint={2309.04662},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
~~~
|
allenai/MADLAD-400
|
[
"task_categories:text-generation",
"size_categories:n>1T",
"license:odc-by",
"arxiv:2309.04662",
"arxiv:2010.14571",
"arxiv:2103.12028",
"region:us"
] |
2023-08-31T23:06:27+00:00
|
{"license": "odc-by", "size_categories": ["n>1T"], "task_categories": ["text-generation"]}
|
2023-10-31T20:47:56+00:00
|
[
"2309.04662",
"2010.14571",
"2103.12028"
] |
[] |
TAGS
#task_categories-text-generation #size_categories-n>1T #license-odc-by #arxiv-2309.04662 #arxiv-2010.14571 #arxiv-2103.12028 #region-us
|
MADLAD-400
==========
Dataset and Introduction
------------------------
MADLAD-400 (*Multilingual Audited Dataset: Low-resource And Document-level*) is
a document-level multilingual dataset based on Common Crawl, covering 419
languages in total. This uses all snapshots of CommonCrawl available as of August
1, 2022. The primary advantage of this dataset over similar datasets is that it
is more multilingual (419 languages), it is audited and more highly filtered,
and it is document-level. The main disadvantage is also its strength -- being
more filtered, it may lack the recall needed for some applications.
There are two versions released: the noisy dataset, which has no filtering
except document-level LangID, and the clean dataset, which has a variety of
filters applied, though it naturally has a fair amount of noise itself. Each
dataset is released in a document-level form that has been deduplicated.
Loading
-------
You can load both the clean and noisy versions of any language by specifying its LangID:
```
from datasets import load_dataset

madlad_abt = load_dataset("allenai/madlad-400", "abt")
```
A list of languages can also be supplied with a keyword argument:
```
madlad_multilang = load_dataset("allenai/madlad-400", languages=["abt", "ace"])
```
Additionally, you can load the noisy and clean subsets separately with the split keyword argument:
```
madlad_multilang_clean = load_dataset("allenai/madlad-400", languages=["abt", "ace"], split="clean")
```
LangID model and Crawl
----------------------
Following Language Id In the Wild, we
trained a Semi-Supervised LangId model (SSLID) on 500 languages. The training
data is as described in that paper, with the differences that 1) training data
is sampled to a temperature of 'T=3' to reduce over-triggering on low-resource
languages; and 2) the data is supplemented with web-crawled data from the same
paper (that has already been through the various filters described therein) in
the hopes that it will increase robustness to web-domain text.
Filtering
---------
Before separating the raw CommonCrawl corpus by LangID, these
filtering steps are done, similar to Raffel et al (2020):
* Discarded any page with fewer than 5 sentences and only retained lines that
contained at least 3 words.
* Removed any line with the word Javascript.
* Removed any page where the phrase “lorem ipsum” appeared.
* Removed any pages containing the phrases "terms of use", "privacy policy",
"cookie policy", "uses cookies", "use of cookies", "use cookies"
* Removed any pages that contained a curly bracket.
* To deduplicate the data set, all but one of any three-sentence span occurring more than once was discarded (a rough sketch of these filters follows this list).
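To make these heuristics concrete, here is a minimal sketch of the document-level filters in Python. This is not the actual MADLAD-400 pipeline code: the sentence splitter is naive, the span dedup is a crude approximation of the description above, and all helper names are invented for illustration.

```python
import re
from typing import Iterable, Optional

BAD_PHRASES = [
    "lorem ipsum", "terms of use", "privacy policy", "cookie policy",
    "uses cookies", "use of cookies", "use cookies",
]

def split_sentences(text: str) -> list:
    # Naive sentence splitter; stands in for whatever the real pipeline used.
    return [s for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def clean_page(page: str) -> Optional[str]:
    """Apply the pre-LangID page filters; return the filtered page,
    or None if the whole page should be discarded."""
    lower = page.lower()
    if any(phrase in lower for phrase in BAD_PHRASES):
        return None
    if "{" in page or "}" in page:  # pages containing a curly bracket
        return None
    # Retain only lines with at least 3 words; drop lines mentioning
    # the word Javascript.
    lines = [
        line for line in page.splitlines()
        if len(line.split()) >= 3 and "javascript" not in line.lower()
    ]
    text = "\n".join(lines)
    if len(split_sentences(text)) < 5:  # discard pages with < 5 sentences
        return None
    return text

def dedup_three_sentence_spans(pages: Iterable[str]) -> Iterable[str]:
    """Keep only the first occurrence of any three-sentence span,
    dropping the final sentence of a span that was seen before."""
    seen = set()
    for page in pages:
        sents = split_sentences(page)
        kept = []
        for i, sent in enumerate(sents):
            span = " ".join(sents[max(0, i - 2): i + 1])
            if span in seen:
                continue
            seen.add(span)
            kept.append(sent)
        yield " ".join(kept)
```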
The 'noisy' subset of the data was filtered only by document-level LangID, which
was taken to be the majority sentence-level LangID prediction. The 'clean'
subset removed all documents with a 'pct\_questionable' score greater than
20%. It furthermore removed any document with under 5 sentences.
The 'pct\_questionable' score is simply the percentage of sentences in the input
document that were "questionable". A sentence was considered questionable if any
of the following were true (a minimal scoring sketch follows the list):
* LangID Consistency: the sentence-level LangID does not match the
document-level LangID
* List Case: The sentence has at least 12 tokens, and over 50% of
the tokens began with a capital letter.
* Length: The sentence has under 20 characters or over 500 characters
(note: this is a bad heuristic for ideographic languages)
* Danger Chars: Over 20% of the characters in the sentence match
'[0-9{}+/()>]'
* Cursedness: The sentence matches a cursed regex (see below)
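Here is a minimal sketch of this scoring, assuming sentence-level and document-level LangID labels are already available. The cursed-regex list below is a stand-in (it catches only the pipe-terminated sentences mentioned in the next section); the real list is not reproduced in this card.

```python
import re

DANGER_CHARS = re.compile(r"[0-9{}+/()>]")
# Stand-in for the real cursed-regex list described in the next
# section; this one only flags sentences ending in a pipe character.
CURSED_REGEXES = [re.compile(r"\|\s*$")]

def is_questionable(sentence: str, sent_lang: str, doc_lang: str) -> bool:
    tokens = sentence.split()
    if sent_lang != doc_lang:                      # LangID Consistency
        return True
    if len(tokens) >= 12 and \
            sum(t[0].isupper() for t in tokens) / len(tokens) > 0.5:
        return True                                # List Case
    if not 20 <= len(sentence) <= 500:             # Length
        return True
    if len(DANGER_CHARS.findall(sentence)) / len(sentence) > 0.2:
        return True                                # Danger Chars
    return any(rx.search(sentence) for rx in CURSED_REGEXES)  # Cursedness

def pct_questionable(sentences, sent_langs, doc_lang) -> float:
    flags = [is_questionable(s, lang, doc_lang)
             for s, lang in zip(sentences, sent_langs)]
    return 100.0 * sum(flags) / max(len(flags), 1)

# A document enters the "clean" split only if pct_questionable(...) <= 20
# and the document has at least 5 sentences.
```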
### Cursed Substrings
Based on the initial round of data audits, the authors created a heuristic list of
substrings and regexes accounting for a large amount of questionable content.
Keep in mind that these all are fed into the 'pct\_questionable' score -- a
sentence is only excluded from the 'clean' dataset if over 20% of the sentences
in that document are flagged as questionable.
notes about cursed substrings:
* low quality sentences ending in the pipe character were very common. Before
you ask, this was not Devanagari-script text using a Danda.
* The last few regexes are meant to match 'A N T S P E A K', 'List Case', and
weirdly regular text (for instance, lists of shipping labels or country
codes)
### Virama Correction
Many languages using Brahmic Abugida (South and Southeast Asian scripts like
Devanagari, Khmer, etc.) use some variant on the virama character. For whatever
reason, it was found that this character was often messed up in the common crawl
snapshots used. Therefore, for the languages 'bn my pa gu or ta te kn ml
si th tl mn lo bo km hi mr ne gom as jv dv bho dz hne ks\_Deva mag mni shn yue zh
ja kjg mnw ksw rki mtr mwr xnr', a special correction step was done.
For these languages, the authors took the list of all virama characters and removed all
unnecessary spaces between each instance of a virama character and the next
character with a regex.
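For illustration, the correction can be done with a single regex substitution. The two virama code points below (Devanagari U+094D and Khmer U+17D2) are examples only; the real list covers every script involved.

```python
import re

# Example virama code points only; the actual list covers all of the
# scripts used by the languages above.
VIRAMAS = "\u094d\u17d2"  # Devanagari virama, Khmer coeng
VIRAMA_SPACES = re.compile("([" + VIRAMAS + r"])\s+")

def fix_viramas(text: str) -> str:
    """Remove spurious whitespace between a virama and the next character."""
    return VIRAMA_SPACES.sub(r"\1", text)

# "क् ष" (ka + virama + stray space + ssa) becomes the conjunct "क्ष".
assert fix_viramas("\u0915\u094d \u0937") == "\u0915\u094d\u0937"
```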
### Myanmar Font Compatibility
Prior to 2019, the most popular font for Burmese websites was the Zawgyi font.
The authors used Myanmar Tools to convert text.
Several scripts, like the Chinese script, Tibetan script, and Thai, do not use
whitespace to separate characters. The languages with this property in this
dataset are 'yue zh ja th lo kjg mnw my shn ksw rki km bo dz'.
Alas, the Length aspect of the 'pct\_questionable' score was calculated using
simplistic whitespace tokenization, and therefore rendered the whole
'pct\_questionable' score invalid for those languages. Therefore, for these
languages, the "clean" data is identical to the "noisy" data (barring Chinese;
see below.)
### Special filters
Chinese had a particular issue with pornographic content. After manual inspection,
a list of strings likely to be present in pornographic content was developed. All
pages containing at least one of these strings were removed. This resulted in a 17%
reduction in the number of documents and a 56% reduction in file size.
A few more random notes, comparing to common alternative codes for these
languages:
* 'fil' for Filipino/Tagalog, not 'tl'
* 'ak' for Twi/Akan, rather than 'tw'. This includes Fante.
* Unfortunately, the macro code 'chm' is used for Meadow Mari (instead of the
correct 'mhr'), and 'mrj' for Hill Mari
* 'no' for Norwegian Bokmål, whereas some resources use
'nb'
* 'ps' for Pashto instead of 'pbt' (Southern Pashto)
* 'ms' for Standard Malay, not 'zlm'
* 'sq' for Albanian, and don't distinguish dialects like
Gheg ('aln') and Tosk ('als')
* 'ber' as the code for Tamazight, after consultation with Tamazight
speakers opining that the dialect distinctions are not significant. Other
resources use the individual codes like 'tzm' and 'kab'.
* Macrocode 'qu' for Quechua. In practice, this seems usually to be
a mix of the Ayacucho and Cusco dialects. Other resources, like NLLB, may
use the dialect code, e.g. 'quy' for Ayacucho Chanka. The same is true for a
few other macro codes, like 'ff' (Macro code for Fulfulde, whereas other
sources may use e.g. 'fuv'.)
* Really, there are notes that can be made about almost any code, from the
well-accepted conventions like 'zh' for Mandarin, to many dialectical notes,
like which variant of Hmong really is the 'hmn' data? But the above ones are
made specifically for ones where the authors are aware of other datasources floating
out there that use different conventions.
Audit
-----
Following Quality at a Glance, the authors performed
an "audit" of every corpus in this dataset. Although the authors did not speak most
languages, they were able to give high-level comments on the general quality. They
looked at a sample of 20 documents of each language.
After an initial round of auditing, they devised a new set of filters and applied
them. They then re-did all audits.
### Overall notes from the audit
The decision was to include languages that looked noisy, but omit any language
that was clearly majority noise, or only had 20 or fewer docs. This is a low
bar -- twenty documents can be very little indeed, and some of the corpora released are quite noisy, but all of them should have at least the potential to
be used in some useful way. The motivation for not releasing nonsense or tiny
datasets is to not give a false sense of how multilingual this dataset actually
is ("Representation washing"), as recommended by Quality at a Glance.
A few overarching points:
* Many low-resource languages only had Bible text, or in some cases URL
data. These are marked in the rows below. Generally 'ok bible' means that
100% of the audited sentences were Biblical, whereas if 'bible' is simply
mentioned in the note, it was not the only source of data.
* Indian languages in the Latin script had a high concentration of
pornographic content.
### Renames and Merges as a result of the Audit
In several cases, it was clear from the audit that the corpora were not in the
languages that the LangID model claimed they were. This led to the following
renames:
* dty renamed to 'zxx-xx-dtynoise', aka a "language" of noise. This is mainly
mis-rendered PDFs and may have some practical applications for decoding
said PDFs.
* 'fan' renamed to 'bum'
* 'ss-SZ' renamed to 'ss' -- this was just a result of us having inconsistent
data labels.
* 'cjk' merged into the 'gil' dataset
* 'bjj' merged into the 'awa' dataset
Canaries
--------
Canaries are provided in a separate 'canaries' folder. Canaries are organized into three directories: 'monolingual' hosts canaries designed for the MADLAD-400 monolingual data, 'multiway' for the multiway data, and 'generic' hosts the generic canaries generated only from the model's vocabulary.
* Monolingual: Canaries here are organized by the language the canary was generated from. This corresponds exactly to the 'translate\_copy' setting in the paper, where the source and target language match.
* Multiway: Canaries here are organized in one of two fashions. 'to\_XX' indicates canaries organized by the target language (and where the source language could be any language). 'XX-XX' indicates the canaries (interleaved\_both and interleaved\_mislabeled\_both) designed for a specific pair of languages.
Within each subdirectory above, canaries are split into separate files named by the canary type. There is always only a single file for each canary type. The 'generic' folder contains within it the four canary types.
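For illustration only, the description above implies a layout along these lines (the top-level folder names come from the text; the placeholder names in angle brackets are hypothetical):

```
canaries/
├── monolingual/
│   └── <lang>/          # canaries generated from <lang> (translate_copy)
├── multiway/
│   ├── to_<XX>/         # organized by target language
│   └── <XX>-<XX>/       # interleaved canaries for a language pair
└── generic/             # the four generic canary types
```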
Canaries can be mixed in with normal training data and then analyzed post-hoc after training.
References
----------
Raffel, Colin, et al. "Exploring the limits of transfer learning with a unified
text-to-text transformer." J. Mach. Learn. Res. 21.140 (2020): 1-67.
Contact
-------
Please reach out to {snehakudugunta, icaswell}꩜URL. For questions about the canaries, reach out to cchoquette@URL
License
-------
This data is released with the 'CC-BY-4.0' license.
Detailed notes from the audit
-----------------------------
Here are the notes on all languages, along with the number of documents
found, and the final decision made with respect to including the language in
this dataset.
A few comments too long to fit in the table above:
* 'alt': WAIT THIS IS AMAZING IT IS ACTUALLY ALTAI! e.g. from urls like
URL
* 'tly-IR': They all look like boilerplate content, e.g., list of
keywords/search queries used to bump page ranking in search results. Not any
useful material for translation. Remove.
* 'zap': pls note that at least some Zapotec speakers tend to view it as one
language, not as a million dialects like ISO does. However, some are
certainly mutually unintelligible, complicating the matter.
* 'zh-Latn': The biggest problem is that several examples are not in Latin
Chinese (i.e., romanization in my understanding) but in English or mixed
English and Chinese. For those data in Latin Chinese, their quality seems to
be good.
* 'zh': Many examples are porn-related, particularly those very long
documents. Also, there are some examples of traditional Chinese.
Final Dataset information
-------------------------
The number of documents, sentences, tokens, characters, and bytes for the noisy
and clean splits of the data. Note that the "toks" field below uses whitespace
for tokenization, so is not appropriate for non-whitespace-separating languages
like Chinese (see section above). Note that the English subset in this version
is missing 18% of documents that were included in the published analysis of the dataset.
These documents will be incorporated in an update coming soon.
```
@misc{kudugunta2023madlad400,
title={MADLAD-400: A Multilingual And Document-Level Large Audited Dataset},
author={Sneha Kudugunta and Isaac Caswell and Biao Zhang and Xavier Garcia and Christopher A. Choquette-Choo and Katherine Lee and Derrick Xin and Aditya Kusupati and Romi Stella and Ankur Bapna and Orhan Firat},
year={2023},
eprint={2309.04662},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
[
"### Cursed Substrings\n\n\nBased on the initial round of data audits, the authors created a heuristic list of\nsubstrings and regexes accounting for a large amount of questionable content.\nKeep in mind that these all are fed into the 'pct\\_questionable' score -- a\nsentence is only excluded from the 'clean' dataset if over 20% of the sentences\nin that document are flagged as questionable.\n\n\nnotes about cursed substrings:\n\n\n* low quality sentences ending in the pipe character were very common. Before\nyou ask, this was not Devanagari-script text using a Danda.\n* The last few regexes are meant to match 'A N T S P E A K', 'List Case', and\nweirdly regular text (for instance, lists of shipping labels or country\ncodes)",
"### Virama Correction\n\n\nMany languages using Brahmic Abugida (South and Southeast Asian scripts like\nDevanagari, Khmer, etc.) use some variant on the virama character. For whatever\nreason, it was found that this character was often messed up in the common crawl\nsnapshots used. Therefore, for the languages 'bn my pa gu or ta te kn ml\nsi th tl mn lo bo km hi mr ne gom as jv dv bho dz hne ks\\_Deva mag mni shn yue zh\nja kjg mnw ksw rki mtr mwr xnr', a special correction step was done.\n\n\nFor these languages, the authors took the list of all virama characters and removed all\nunnecessary spaces between each instance of a virama character and the next\ncharacter with a regex.",
"### Myanmar Font Compatibility\n\n\nPrior to 2019, the most popular font for Burmese websites was the Zawgyi font.\nThe authors used Myanmar Tools to convert text.\n\n\nSeveral scripts, like the Chinese script, Tibetan script, and Thai, do not use\nwhitespace to separate characters. The languages with this property in this\ndataset are 'yue zh ja th lo kjg mnw my shn ksw rki km bo dz'.\n\n\nAlas, the Length aspect of the 'pct\\_questionable' score was calculated using\nsimplistic whitespace tokenization, and therefore rendered the whole\n'pct\\_questionable' score invalid for those languages. Therefore, for these\nlanguages, the \"clean\" data is identical to the \"noisy\" data (barring Chinese;\nsee below.)",
"### Special filters\n\n\nChinese had a particular issue with pornographic content. After manual inspection\na list of strings likely to be present in pornographic content was developed. All\npages containing at least one of these strings were removed. Resulted in 17%\nreduction in number of documents and 56% reduction in file size.\n\n\nA few more random notes, comparing to common alternative codes for these\nlanguages:\n\n\n* 'fil' for Filipino/Tagalog, not 'tl'\n* 'ak' for Twi/Akan, rather than 'tw'. This includes Fante.\n* Unfortunately use the macro code 'chm' for Meadow Mari (instead of the\ncorrect 'mhr'), and 'mrj' for Hill Mari\n* 'no' for Norwegian Bokmål, whereas some resources use\n'nb'\n* 'ps' for Pashto instead of 'pbt' (Southern Pashto)\n* 'ms' for Standard Malay, not 'zlm'\n* 'sq' for Albanian, and don't distinguish dialects like\nGheg ('aln') and Tosk ('als')\n* 'ber' as the code for Tamazight, after consultation with Tamazight\nspeakers opining that the dialect distinctions are not significant. Other\nresources use the individual codes like 'tzm' and 'kab'.\n* Macrocode 'qu' for Quechua. In practice, this seems usually to be\na mix of the Ayacucho and Cusco dialects. Other resources, like NLLB, may\nuse the dialect code, e.g. 'quy' for Ayacucho Chanka. The same is true for a\nfew other macro codes, like 'ff' (Macro code for Fulfulde, whereas other\nsources may use e.g. 'fuv'.)\n* Really, there are notes that can be made about almost any code, from the\nwell-accepted conventions like 'zh' for Mandarin, to many dialectical notes,\nlike which variant of Hmong really is the 'hmn' data? But the above ones are\nmade specifically for ones where the authors are aware of other datasources floating\nout there that use different conventions.\n\n\nAudit\n-----\n\n\nFollowing Quality at a Glance, the authors performed\nan \"audit\" of every corpus in this dataset. Although the authors did not speak most\nlanguages, they were able to give high-level comments on the general quality. They\nlooked at a sample of 20 documents of each language.\n\n\nAfter an initial round of auditing, they devised a new set of filters and applied\nthem. They then re-did all audits.",
"### Overall notes from the audit\n\n\nThe decision was to include languages that looked noisy, but omit any language\nthat was clearly majority noise, or only had 20 or fewer docs. This is a low\nbar -- twenty documents can be very little indeed, and some of the corpora released are quite noisy, but all of them should have at least the potential to\nbe used in some useful way. The motivation for not releasing nonsense or tiny\ndatasets is to not give a false sense of how multilingual this dataset actually\nis (\"Representation washing\"), as recommended by Quality at a Glance.\n\n\nA few overarching points:\n\n\n* Many low-resource languages only had Bible text, or in some cases URL\ndata. These are marked in the rows below. Generally 'ok bible' means that\n100% of the audited sentences were Biblical, whereas if 'bible' is simply\nmentioned in the note, it was not the only source of data.\n* Indian languages in the Latin script had a high concentration of\npornographic content.",
"### Renames and Merges as a result of the Audit\n\n\nIn several cases, it was clear from the audit that the corpora were not in the\nlanguages that the LangID model claimed they were. This led to the following\nrenames:\n\n\n* dty renamed to 'zxx-xx-dtynoise', aka a \"language\" of noise. This is mainly\nmis-rendered PDFs and may have some practical applications for decoding\nsaid.\n* 'fan' renamed to 'bum'\n* 'ss-SZ' renamed to 'ss' -- this was just a result of us having inconsistent\ndata labels.\n* 'cjk' merged into the 'gil' dataset\n* 'bjj' merged into the 'awa' dataset\n\n\nCanaries\n--------\n\n\nCanaries are provided in separate 'canaries' folder. Canaries are organized into three directions: 'monolingual' hosts canaries designed for the MADLAD-400 monody data, 'multiway' for the multiway data, and 'generic' the generic canaries generated only from the model's vocabulary.\n\n\n* Monolingual: Canaries here are organized by the language the canary was generated from. This corresponds exactly to the 'translate\\_copy' setting in the paper, where the source and target language match.\n* Multiway: Canaries here are organized in one of two fashions. 'to\\_XX' indicates canaries organized by the target language (and where the source language could be any language). 'XX-XX' indicates the canaries (interleaved\\_both and interleaved\\_mislabeled\\_both) designed for a specific pair of languages.\n\n\nWithin each subdirectory above, canaries are into separate files named by the canary type. There is always only a single file for each canary type. The 'generic' folder contains within it the four canary types.\n\n\nCanaries can be mixed in with normal training data to then be analyzed post-hoc to training\n\n\nReferences\n----------\n\n\nRaffel, Colin, et al. \"Exploring the limits of transfer learning with a unified\ntext-to-text transformer.\" J. Mach. Learn. Res. 21.140 (2020): 1-67.\n\n\nContact\n-------\n\n\nPlease reach out to {snehakudugunta, icaswell}꩜URL. For questions about the canaries, reach out to cchoquette@URL\n\n\nLicense\n-------\n\n\nThis data is released with the 'CC-BY-4.0' license.\n\n\nDetailed notes from the audit\n-----------------------------\n\n\nHere are the notes on all languages, along with the number of documents\nfound, and the final decision made with respect to including the language in\nthis dataset.\n\n\n\nA few comments too long to fit in the table above:\n\n\n* 'alt': WAIT THIS IS AMAZING IT IS ACTUALLY ALTAI! e.g. from urls like\nURL\n* 'tly-IR': They all look like boilerplate content, e.g., list of\nkeywords/search queries used to bump page ranking in search results. Not any\nuseful material for translation. Remove.\n* 'zap': pls note that at least some Zapotec speakers tend to view it as one\nlanguage, not as a million dialects like ISO does. However, some are\ncertainly mutually unintelligible, complicating the matter.\n* 'zh-Latn': The biggest problem is that several examples are not in Latin\nChinese (i.e., romanization in my understanding) but in English or mixed\nEnglish and Chinese. For those data in Latin Chinese, their quality seems to\nbe good.\n* 'zh': Many examples are porn-related, particularly those very long\ndocuments. Also, there are some examples of traditional Chinese.\n\n\nFinal Dataset information\n-------------------------\n\n\nThe number of documents, sentences, tokens, characters, and bytes for the noisy\nand clean splits of the data. 
Note that the \"toks\" field below uses whitespace\nfor tokenization, so is not appropriate for non-whitespace-separating languages\nlike Chinese (see section above). Note that the english subset in this version\nis missing 18% of documents that were included in the published analysis of the dataset.\nThese documents will be incoporated in an update coming soon.\n\n\n\n\n```\n@misc{kudugunta2023madlad400,\n title={MADLAD-400: A Multilingual And Document-Level Large Audited Dataset}, \n author={Sneha Kudugunta and Isaac Caswell and Biao Zhang and Xavier Garcia and Christopher A. Choquette-Choo and Katherine Lee and Derrick Xin and Aditya Kusupati and Romi Stella and Ankur Bapna and Orhan Firat},\n year={2023},\n eprint={2309.04662},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}\n\n```"
] |
[
"TAGS\n#task_categories-text-generation #size_categories-n>1T #license-odc-by #arxiv-2309.04662 #arxiv-2010.14571 #arxiv-2103.12028 #region-us \n",
"### Cursed Substrings\n\n\nBased on the initial round of data audits, the authors created a heuristic list of\nsubstrings and regexes accounting for a large amount of questionable content.\nKeep in mind that these all are fed into the 'pct\\_questionable' score -- a\nsentence is only excluded from the 'clean' dataset if over 20% of the sentences\nin that document are flagged as questionable.\n\n\nnotes about cursed substrings:\n\n\n* low quality sentences ending in the pipe character were very common. Before\nyou ask, this was not Devanagari-script text using a Danda.\n* The last few regexes are meant to match 'A N T S P E A K', 'List Case', and\nweirdly regular text (for instance, lists of shipping labels or country\ncodes)",
"### Virama Correction\n\n\nMany languages using Brahmic Abugida (South and Southeast Asian scripts like\nDevanagari, Khmer, etc.) use some variant on the virama character. For whatever\nreason, it was found that this character was often messed up in the common crawl\nsnapshots used. Therefore, for the languages 'bn my pa gu or ta te kn ml\nsi th tl mn lo bo km hi mr ne gom as jv dv bho dz hne ks\\_Deva mag mni shn yue zh\nja kjg mnw ksw rki mtr mwr xnr', a special correction step was done.\n\n\nFor these languages, the authors took the list of all virama characters and removed all\nunnecessary spaces between each instance of a virama character and the next\ncharacter with a regex.",
"### Myanmar Font Compatibility\n\n\nPrior to 2019, the most popular font for Burmese websites was the Zawgyi font.\nThe authors used Myanmar Tools to convert text.\n\n\nSeveral scripts, like the Chinese script, Tibetan script, and Thai, do not use\nwhitespace to separate characters. The languages with this property in this\ndataset are 'yue zh ja th lo kjg mnw my shn ksw rki km bo dz'.\n\n\nAlas, the Length aspect of the 'pct\\_questionable' score was calculated using\nsimplistic whitespace tokenization, and therefore rendered the whole\n'pct\\_questionable' score invalid for those languages. Therefore, for these\nlanguages, the \"clean\" data is identical to the \"noisy\" data (barring Chinese;\nsee below.)",
"### Special filters\n\n\nChinese had a particular issue with pornographic content. After manual inspection\na list of strings likely to be present in pornographic content was developed. All\npages containing at least one of these strings were removed. Resulted in 17%\nreduction in number of documents and 56% reduction in file size.\n\n\nA few more random notes, comparing to common alternative codes for these\nlanguages:\n\n\n* 'fil' for Filipino/Tagalog, not 'tl'\n* 'ak' for Twi/Akan, rather than 'tw'. This includes Fante.\n* Unfortunately use the macro code 'chm' for Meadow Mari (instead of the\ncorrect 'mhr'), and 'mrj' for Hill Mari\n* 'no' for Norwegian Bokmål, whereas some resources use\n'nb'\n* 'ps' for Pashto instead of 'pbt' (Southern Pashto)\n* 'ms' for Standard Malay, not 'zlm'\n* 'sq' for Albanian, and don't distinguish dialects like\nGheg ('aln') and Tosk ('als')\n* 'ber' as the code for Tamazight, after consultation with Tamazight\nspeakers opining that the dialect distinctions are not significant. Other\nresources use the individual codes like 'tzm' and 'kab'.\n* Macrocode 'qu' for Quechua. In practice, this seems usually to be\na mix of the Ayacucho and Cusco dialects. Other resources, like NLLB, may\nuse the dialect code, e.g. 'quy' for Ayacucho Chanka. The same is true for a\nfew other macro codes, like 'ff' (Macro code for Fulfulde, whereas other\nsources may use e.g. 'fuv'.)\n* Really, there are notes that can be made about almost any code, from the\nwell-accepted conventions like 'zh' for Mandarin, to many dialectical notes,\nlike which variant of Hmong really is the 'hmn' data? But the above ones are\nmade specifically for ones where the authors are aware of other datasources floating\nout there that use different conventions.\n\n\nAudit\n-----\n\n\nFollowing Quality at a Glance, the authors performed\nan \"audit\" of every corpus in this dataset. Although the authors did not speak most\nlanguages, they were able to give high-level comments on the general quality. They\nlooked at a sample of 20 documents of each language.\n\n\nAfter an initial round of auditing, they devised a new set of filters and applied\nthem. They then re-did all audits.",
"### Overall notes from the audit\n\n\nThe decision was to include languages that looked noisy, but omit any language\nthat was clearly majority noise, or only had 20 or fewer docs. This is a low\nbar -- twenty documents can be very little indeed, and some of the corpora released are quite noisy, but all of them should have at least the potential to\nbe used in some useful way. The motivation for not releasing nonsense or tiny\ndatasets is to not give a false sense of how multilingual this dataset actually\nis (\"Representation washing\"), as recommended by Quality at a Glance.\n\n\nA few overarching points:\n\n\n* Many low-resource languages only had Bible text, or in some cases URL\ndata. These are marked in the rows below. Generally 'ok bible' means that\n100% of the audited sentences were Biblical, whereas if 'bible' is simply\nmentioned in the note, it was not the only source of data.\n* Indian languages in the Latin script had a high concentration of\npornographic content.",
"### Renames and Merges as a result of the Audit\n\n\nIn several cases, it was clear from the audit that the corpora were not in the\nlanguages that the LangID model claimed they were. This led to the following\nrenames:\n\n\n* dty renamed to 'zxx-xx-dtynoise', aka a \"language\" of noise. This is mainly\nmis-rendered PDFs and may have some practical applications for decoding\nsaid.\n* 'fan' renamed to 'bum'\n* 'ss-SZ' renamed to 'ss' -- this was just a result of us having inconsistent\ndata labels.\n* 'cjk' merged into the 'gil' dataset\n* 'bjj' merged into the 'awa' dataset\n\n\nCanaries\n--------\n\n\nCanaries are provided in separate 'canaries' folder. Canaries are organized into three directions: 'monolingual' hosts canaries designed for the MADLAD-400 monody data, 'multiway' for the multiway data, and 'generic' the generic canaries generated only from the model's vocabulary.\n\n\n* Monolingual: Canaries here are organized by the language the canary was generated from. This corresponds exactly to the 'translate\\_copy' setting in the paper, where the source and target language match.\n* Multiway: Canaries here are organized in one of two fashions. 'to\\_XX' indicates canaries organized by the target language (and where the source language could be any language). 'XX-XX' indicates the canaries (interleaved\\_both and interleaved\\_mislabeled\\_both) designed for a specific pair of languages.\n\n\nWithin each subdirectory above, canaries are into separate files named by the canary type. There is always only a single file for each canary type. The 'generic' folder contains within it the four canary types.\n\n\nCanaries can be mixed in with normal training data to then be analyzed post-hoc to training\n\n\nReferences\n----------\n\n\nRaffel, Colin, et al. \"Exploring the limits of transfer learning with a unified\ntext-to-text transformer.\" J. Mach. Learn. Res. 21.140 (2020): 1-67.\n\n\nContact\n-------\n\n\nPlease reach out to {snehakudugunta, icaswell}꩜URL. For questions about the canaries, reach out to cchoquette@URL\n\n\nLicense\n-------\n\n\nThis data is released with the 'CC-BY-4.0' license.\n\n\nDetailed notes from the audit\n-----------------------------\n\n\nHere are the notes on all languages, along with the number of documents\nfound, and the final decision made with respect to including the language in\nthis dataset.\n\n\n\nA few comments too long to fit in the table above:\n\n\n* 'alt': WAIT THIS IS AMAZING IT IS ACTUALLY ALTAI! e.g. from urls like\nURL\n* 'tly-IR': They all look like boilerplate content, e.g., list of\nkeywords/search queries used to bump page ranking in search results. Not any\nuseful material for translation. Remove.\n* 'zap': pls note that at least some Zapotec speakers tend to view it as one\nlanguage, not as a million dialects like ISO does. However, some are\ncertainly mutually unintelligible, complicating the matter.\n* 'zh-Latn': The biggest problem is that several examples are not in Latin\nChinese (i.e., romanization in my understanding) but in English or mixed\nEnglish and Chinese. For those data in Latin Chinese, their quality seems to\nbe good.\n* 'zh': Many examples are porn-related, particularly those very long\ndocuments. Also, there are some examples of traditional Chinese.\n\n\nFinal Dataset information\n-------------------------\n\n\nThe number of documents, sentences, tokens, characters, and bytes for the noisy\nand clean splits of the data. 
Note that the \"toks\" field below uses whitespace\nfor tokenization, so is not appropriate for non-whitespace-separating languages\nlike Chinese (see section above). Note that the english subset in this version\nis missing 18% of documents that were included in the published analysis of the dataset.\nThese documents will be incoporated in an update coming soon.\n\n\n\n\n```\n@misc{kudugunta2023madlad400,\n title={MADLAD-400: A Multilingual And Document-Level Large Audited Dataset}, \n author={Sneha Kudugunta and Isaac Caswell and Biao Zhang and Xavier Garcia and Christopher A. Choquette-Choo and Katherine Lee and Derrick Xin and Aditya Kusupati and Romi Stella and Ankur Bapna and Orhan Firat},\n year={2023},\n eprint={2309.04662},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}\n\n```"
] |
[
59,
183,
190,
186,
574,
230,
1088
] |
[
"passage: TAGS\n#task_categories-text-generation #size_categories-n>1T #license-odc-by #arxiv-2309.04662 #arxiv-2010.14571 #arxiv-2103.12028 #region-us \n### Cursed Substrings\n\n\nBased on the initial round of data audits, the authors created a heuristic list of\nsubstrings and regexes accounting for a large amount of questionable content.\nKeep in mind that these all are fed into the 'pct\\_questionable' score -- a\nsentence is only excluded from the 'clean' dataset if over 20% of the sentences\nin that document are flagged as questionable.\n\n\nnotes about cursed substrings:\n\n\n* low quality sentences ending in the pipe character were very common. Before\nyou ask, this was not Devanagari-script text using a Danda.\n* The last few regexes are meant to match 'A N T S P E A K', 'List Case', and\nweirdly regular text (for instance, lists of shipping labels or country\ncodes)### Virama Correction\n\n\nMany languages using Brahmic Abugida (South and Southeast Asian scripts like\nDevanagari, Khmer, etc.) use some variant on the virama character. For whatever\nreason, it was found that this character was often messed up in the common crawl\nsnapshots used. Therefore, for the languages 'bn my pa gu or ta te kn ml\nsi th tl mn lo bo km hi mr ne gom as jv dv bho dz hne ks\\_Deva mag mni shn yue zh\nja kjg mnw ksw rki mtr mwr xnr', a special correction step was done.\n\n\nFor these languages, the authors took the list of all virama characters and removed all\nunnecessary spaces between each instance of a virama character and the next\ncharacter with a regex.",
"passage: ### Myanmar Font Compatibility\n\n\nPrior to 2019, the most popular font for Burmese websites was the Zawgyi font.\nThe authors used Myanmar Tools to convert text.\n\n\nSeveral scripts, like the Chinese script, Tibetan script, and Thai, do not use\nwhitespace to separate characters. The languages with this property in this\ndataset are 'yue zh ja th lo kjg mnw my shn ksw rki km bo dz'.\n\n\nAlas, the Length aspect of the 'pct\\_questionable' score was calculated using\nsimplistic whitespace tokenization, and therefore rendered the whole\n'pct\\_questionable' score invalid for those languages. Therefore, for these\nlanguages, the \"clean\" data is identical to the \"noisy\" data (barring Chinese;\nsee below.)",
"passage: ### Special filters\n\n\nChinese had a particular issue with pornographic content. After manual inspection\na list of strings likely to be present in pornographic content was developed. All\npages containing at least one of these strings were removed. Resulted in 17%\nreduction in number of documents and 56% reduction in file size.\n\n\nA few more random notes, comparing to common alternative codes for these\nlanguages:\n\n\n* 'fil' for Filipino/Tagalog, not 'tl'\n* 'ak' for Twi/Akan, rather than 'tw'. This includes Fante.\n* Unfortunately use the macro code 'chm' for Meadow Mari (instead of the\ncorrect 'mhr'), and 'mrj' for Hill Mari\n* 'no' for Norwegian Bokmål, whereas some resources use\n'nb'\n* 'ps' for Pashto instead of 'pbt' (Southern Pashto)\n* 'ms' for Standard Malay, not 'zlm'\n* 'sq' for Albanian, and don't distinguish dialects like\nGheg ('aln') and Tosk ('als')\n* 'ber' as the code for Tamazight, after consultation with Tamazight\nspeakers opining that the dialect distinctions are not significant. Other\nresources use the individual codes like 'tzm' and 'kab'.\n* Macrocode 'qu' for Quechua. In practice, this seems usually to be\na mix of the Ayacucho and Cusco dialects. Other resources, like NLLB, may\nuse the dialect code, e.g. 'quy' for Ayacucho Chanka. The same is true for a\nfew other macro codes, like 'ff' (Macro code for Fulfulde, whereas other\nsources may use e.g. 'fuv'.)\n* Really, there are notes that can be made about almost any code, from the\nwell-accepted conventions like 'zh' for Mandarin, to many dialectical notes,\nlike which variant of Hmong really is the 'hmn' data? But the above ones are\nmade specifically for ones where the authors are aware of other datasources floating\nout there that use different conventions.\n\n\nAudit\n-----\n\n\nFollowing Quality at a Glance, the authors performed\nan \"audit\" of every corpus in this dataset. Although the authors did not speak most\nlanguages, they were able to give high-level comments on the general quality. They\nlooked at a sample of 20 documents of each language.\n\n\nAfter an initial round of auditing, they devised a new set of filters and applied\nthem. They then re-did all audits.### Overall notes from the audit\n\n\nThe decision was to include languages that looked noisy, but omit any language\nthat was clearly majority noise, or only had 20 or fewer docs. This is a low\nbar -- twenty documents can be very little indeed, and some of the corpora released are quite noisy, but all of them should have at least the potential to\nbe used in some useful way. The motivation for not releasing nonsense or tiny\ndatasets is to not give a false sense of how multilingual this dataset actually\nis (\"Representation washing\"), as recommended by Quality at a Glance.\n\n\nA few overarching points:\n\n\n* Many low-resource languages only had Bible text, or in some cases URL\ndata. These are marked in the rows below. Generally 'ok bible' means that\n100% of the audited sentences were Biblical, whereas if 'bible' is simply\nmentioned in the note, it was not the only source of data.\n* Indian languages in the Latin script had a high concentration of\npornographic content."
] |
f88acdb17eeae6abc67f1b3157607c2c87800ae6
|
# Dataset Card for Evaluation run of Undi95/UndiMix-v1-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/UndiMix-v1-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/UndiMix-v1-13b](https://huggingface.co/Undi95/UndiMix-v1-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__UndiMix-v1-13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-16T16:31:03.720074](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__UndiMix-v1-13b/blob/main/results_2023-10-16T16-31-03.720074.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2600671140939597,
"em_stderr": 0.004492401208347132,
"f1": 0.34945260067114264,
"f1_stderr": 0.004422869896423944,
"acc": 0.42730704720576507,
"acc_stderr": 0.010180773732934644
},
"harness|drop|3": {
"em": 0.2600671140939597,
"em_stderr": 0.004492401208347132,
"f1": 0.34945260067114264,
"f1_stderr": 0.004422869896423944
},
"harness|gsm8k|5": {
"acc": 0.10007581501137225,
"acc_stderr": 0.008266274528685646
},
"harness|winogrande|5": {
"acc": 0.7545382794001578,
"acc_stderr": 0.012095272937183644
}
}
```
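As a quick sanity check on these numbers (this is an observation, not part of the harness output itself), the top-level "acc" matches the unweighted mean of the two accuracy-bearing tasks:

```python
gsm8k_acc = 0.10007581501137225
winogrande_acc = 0.7545382794001578

mean_acc = (gsm8k_acc + winogrande_acc) / 2
assert abs(mean_acc - 0.42730704720576507) < 1e-12  # matches "all"/"acc"
```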
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Undi95__UndiMix-v1-13b
|
[
"region:us"
] |
2023-08-31T23:11:10+00:00
|
{"pretty_name": "Evaluation run of Undi95/UndiMix-v1-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Undi95/UndiMix-v1-13b](https://huggingface.co/Undi95/UndiMix-v1-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__UndiMix-v1-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T16:31:03.720074](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__UndiMix-v1-13b/blob/main/results_2023-10-16T16-31-03.720074.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2600671140939597,\n \"em_stderr\": 0.004492401208347132,\n \"f1\": 0.34945260067114264,\n \"f1_stderr\": 0.004422869896423944,\n \"acc\": 0.42730704720576507,\n \"acc_stderr\": 0.010180773732934644\n },\n \"harness|drop|3\": {\n \"em\": 0.2600671140939597,\n \"em_stderr\": 0.004492401208347132,\n \"f1\": 0.34945260067114264,\n \"f1_stderr\": 0.004422869896423944\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10007581501137225,\n \"acc_stderr\": 0.008266274528685646\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7545382794001578,\n \"acc_stderr\": 0.012095272937183644\n }\n}\n```", "repo_url": "https://huggingface.co/Undi95/UndiMix-v1-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|arc:challenge|25_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T16_31_03.720074", "path": ["**/details_harness|drop|3_2023-10-16T16-31-03.720074.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T16-31-03.720074.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T16_31_03.720074", "path": ["**/details_harness|gsm8k|5_2023-10-16T16-31-03.720074.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T16-31-03.720074.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hellaswag|10_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T00:10:45.842963.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T00:10:45.842963.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-01T00:10:45.842963.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-01T00:10:45.842963.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-01T00:10:45.842963.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T16_31_03.720074", "path": ["**/details_harness|winogrande|5_2023-10-16T16-31-03.720074.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T16-31-03.720074.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_01T00_10_45.842963", "path": ["results_2023-09-01T00:10:45.842963.parquet"]}, {"split": "2023_10_16T16_31_03.720074", "path": ["results_2023-10-16T16-31-03.720074.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T16-31-03.720074.parquet"]}]}]}
|
2023-10-16T15:31:16+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Undi95/UndiMix-v1-13b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Undi95/UndiMix-v1-13b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
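A minimal sketch of that loading call, assuming the details repository follows the leaderboard's standard `details_<org>__<model>` naming scheme (the exact repo name and configuration chosen here are assumptions):

```python
from datasets import load_dataset

# Repo name assumed from the standard leaderboard naming scheme;
# "harness_winogrande_5" is one of the 64 task configurations.
data = load_dataset(
    "open-llm-leaderboard/details_Undi95__UndiMix-v1-13b",
    "harness_winogrande_5",
    split="train",
)
```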
## Latest results
These are the latest results from run 2023-10-16T16:31:03.720074 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Undi95/UndiMix-v1-13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/UndiMix-v1-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T16:31:03.720074(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Undi95/UndiMix-v1-13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/UndiMix-v1-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T16:31:03.720074(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Undi95/UndiMix-v1-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/UndiMix-v1-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T16:31:03.720074(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b3e8230da6eaa37cdedd62cd0f1b50e1cb36d076
|
# AutoTrain Dataset for project: vape-chat
## Dataset Description
This dataset has been automatically processed by AutoTrain for project vape-chat.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"context": "\u2020###Human:\ngenerate a help prompt for What's in vape juice?\n\n###Assistant:\nVape juice, or e-liquid, typically contains propylene glycol (PG), vegetable glycerin (VG), nicotine, and flavorings. However, there are nicotine-free versions available.",
"question": "What's in vape juice?",
"answers.text": [
"Vape juice, or e-liquid, typically contains propylene glycol (PG), vegetable glycerin (VG), nicotine, and flavorings. However, there are nicotine-free versions available."
],
"answers.answer_start": [
75
]
},
{
"context": "\u2020###Human:\ngenerate a help prompt for What's in vape juice? (version 6)\n\n###Assistant:\nVape juice, or e-liquid, typically contains propylene glycol (PG), vegetable glycerin (VG), nicotine, and flavorings. However, there are nicotine-free versions available. (version 6)",
"question": "What's in vape juice? (version 6)",
"answers.text": [
"Vape juice, or e-liquid, typically contains propylene glycol (PG), vegetable glycerin (VG), nicotine, and flavorings. However, there are nicotine-free versions available. (version 6)"
],
"answers.answer_start": [
87
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"context": "Value(dtype='string', id=None)",
"question": "Value(dtype='string', id=None)",
"answers.text": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"answers.answer_start": "Sequence(feature=Value(dtype='int32', id=None), length=-1, id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 64 |
| valid | 16 |
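As a quick illustration, a minimal loading sketch with the `datasets` library (the split names below are assumed from the table above):

```python
from datasets import load_dataset

# Split names ("train"/"valid") assumed from the split-size table above;
# AutoTrain repos sometimes expose "validation" instead of "valid".
ds = load_dataset("ridenight/autotrain-data-vape-chat")
print(ds["train"][0]["question"])
print(len(ds["train"]), len(ds["valid"]))  # expected: 64 16
```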
|
ridenight/autotrain-data-vape-chat
|
[
"language:en",
"region:us"
] |
2023-08-31T23:39:52+00:00
|
{"language": ["en"]}
|
2023-08-31T23:42:38+00:00
|
[] |
[
"en"
] |
TAGS
#language-English #region-us
|
AutoTrain Dataset for project: vape-chat
========================================
Dataset Description
-------------------
This dataset has been automatically processed by AutoTrain for project vape-chat.
### Languages
The BCP-47 code for the dataset's language is en.
Dataset Structure
-----------------
### Data Instances
A sample from this dataset looks as follows:
### Dataset Fields
The dataset has the following fields (also called "features"):
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
|
[
"### Languages\n\n\nThe BCP-47 code for the dataset's language is en.\n\n\nDataset Structure\n-----------------",
"### Data Instances\n\n\nA sample from this dataset looks as follows:",
"### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):",
"### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:"
] |
[
"TAGS\n#language-English #region-us \n",
"### Languages\n\n\nThe BCP-47 code for the dataset's language is en.\n\n\nDataset Structure\n-----------------",
"### Data Instances\n\n\nA sample from this dataset looks as follows:",
"### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):",
"### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:"
] |
[
10,
26,
17,
23,
27
] |
[
"passage: TAGS\n#language-English #region-us \n### Languages\n\n\nThe BCP-47 code for the dataset's language is en.\n\n\nDataset Structure\n-----------------### Data Instances\n\n\nA sample from this dataset looks as follows:### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:"
] |
5142689a3888786db7d376e60c16c5c1e7201191
|
This repo provides part of the dataset used for PMC-LLaMA-13B's instruction tuning.
| Data | Size | Link |
| --- | --- | --- |
| ChatDoctor | 100K | https://www.yunxiangli.top/ChatDoctor/ |
| MedQA | 10.2K | https://huggingface.co/datasets/GBaker/MedQA-USMLE-4-options |
| MedMCQA | 183K | https://huggingface.co/datasets/medmcqa |
| PubmedQA | 211K | https://huggingface.co/datasets/pubmed_qa |
| LiveQA | 635 | https://huggingface.co/datasets/truehealth/liveqa |
| MedicationQA | 690 | https://huggingface.co/datasets/truehealth/medicationqa |
| UMLS | 99K | https://www.nlm.nih.gov/research/umls/index.html |
The whole instruction dataset is composed of 7 parts. We have covered all of them in this dataset repo except for *ChatDoctor*.
You should consider merging ChatDoctor's data to obtain the complete dataset.
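A minimal sketch of what that merge could look like, assuming ChatDoctor's data has already been downloaded separately and converted to the same column schema (the file name below is purely illustrative):

```python
from datasets import load_dataset, concatenate_datasets

pmc = load_dataset("axiong/pmc_llama_instructions", split="train")

# ChatDoctor is distributed separately (see the table above); this assumes
# it was converted to a JSON file whose columns match pmc's features,
# which concatenate_datasets requires.
chat_doctor = load_dataset("json", data_files="chatdoctor_converted.json", split="train")

full = concatenate_datasets([pmc, chat_doctor])
```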
|
axiong/pmc_llama_instructions
|
[
"task_categories:question-answering",
"task_categories:text-generation",
"language:en",
"license:openrail",
"biology",
"med",
"region:us"
] |
2023-08-31T23:56:32+00:00
|
{"language": ["en"], "license": "openrail", "task_categories": ["question-answering", "text-generation"], "tags": ["biology", "med"]}
|
2023-11-23T08:47:30+00:00
|
[] |
[
"en"
] |
TAGS
#task_categories-question-answering #task_categories-text-generation #language-English #license-openrail #biology #med #region-us
|
This repo provides part of the dataset used for PMC-LLaMA-13B's instruction tuning.
Data: ChatDoctor, Size: 100K, Link: URL
Data: MedQA, Size: 10.2K, Link: URL
Data: MedMCQA, Size: 183K, Link: URL
Data: PubmedQA, Size: 211K, Link: URL
Data: LiveQA, Size: 635, Link: URL
Data: MedicationQA, Size: 690, Link: URL
Data: UMLS, Size: 99K, Link: URL
The whole instruction dataset is composed of 7 parts. We have covered all of them in this dataset repo except for *ChatDoctor*.
You should consider merging ChatDoctor's data to obtain the complete dataset.
|
[] |
[
"TAGS\n#task_categories-question-answering #task_categories-text-generation #language-English #license-openrail #biology #med #region-us \n"
] |
[
44
] |
[
"passage: TAGS\n#task_categories-question-answering #task_categories-text-generation #language-English #license-openrail #biology #med #region-us \n"
] |
713f99b91ab9dbbbc0e18ccbcafdff3ee709ba4b
|
# Asclepius: Synthetic Clinical Notes & Instruction Dataset
## Dataset Description
- **Repository:** [Github](https://github.com/starmpcc/Asclepius)
- **Paper:** https://arxiv.org/abs/2309.00237
### Dataset Summary
This dataset is the official dataset for Asclepius [(arxiv)](https://arxiv.org/abs/2309.00237)
This dataset is structured in a Clinical Note - Question - Answer format to build clinical LLMs.
- We first synthesized clinical notes from [PMC-Patients](https://huggingface.co/datasets/zhengyun21/PMC-Patients) case reports with GPT-3.5
- Then, we generated instruction-answer pairs for 157k synthetic discharge summaries
### Supported Tasks
- This dataset covers the 8 tasks below:
- Named Entity Recognition
- Abbreviation Expansion
- Relation Extraction
- Temporal Information Extraction
- Coreference Resolution
- Paraphrasing
- Summarization
- Question Answering
### Languages
English
## Dataset Structure
### Data Instances
- `synthetic.csv`
- Clinical Note - Question - Answer pairs
### Data Fields
- `patient_id`: Unique case report id from PMC-Patients
- `patient`: Case report text
- `question`: GPT-3.5-generated instruction from the patient note. The prompt used can be checked on our GitHub.
- `answer`: GPT-3.5-generated answer for the given case report and question
- `task`: Corresponding category of the question, one of the tasks listed above
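As a quick illustration of these fields, a minimal loading-and-filtering sketch (the split name and the exact string value of `task` are assumptions):

```python
from datasets import load_dataset

# Split name assumed; the card lists a single `synthetic.csv` file.
ds = load_dataset("starmpcc/Asclepius-Synthetic-Clinical-Notes", split="train")

# Keep only one of the eight task categories, e.g. summarization
# (capitalization of the label assumed from the task list above).
summaries = ds.filter(lambda ex: ex["task"] == "Summarization")
print(summaries[0]["question"])
```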
## Dataset Creation
### Source Data
[PMC-Patients](https://huggingface.co/datasets/zhengyun21/PMC-Patients)
### Annotations
We used GPT-3.5-turbo (version 0314).
You can check the prompts on our GitHub.
## Additional Information
### Models
- [Asclepius-7B](https://huggingface.co/starmpcc/Asclepius-7B)
- [Asclepius-13B](https://huggingface.co/starmpcc/Asclepius-13B)
- [Asclepius-Llama2-7B](https://huggingface.co/starmpcc/Asclepius-Llama2-7B)
- [Asclepius-Llama2-13B](https://huggingface.co/starmpcc/Asclepius-Llama2-13B)
### Variants
- The instruction-answer pairs generated from MIMIC-III discharge summaries and the models trained with them are now available on [Physionet](https://physionet.org/content/asclepius-r/1.0.0/)!
### Licensing Information
CC-BY-NC-SA 4.0
### Citation Information
```
@misc{kweon2023publicly,
title={Publicly Shareable Clinical Large Language Model Built on Synthetic Clinical Notes},
author={Sunjun Kweon and Junu Kim and Jiyoun Kim and Sujeong Im and Eunbyeol Cho and Seongsu Bae and Jungwoo Oh and Gyubok Lee and Jong Hak Moon and Seng Chan You and Seungjin Baek and Chang Hoon Han and Yoon Bin Jung and Yohan Jo and Edward Choi},
year={2023},
eprint={2309.00237},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
starmpcc/Asclepius-Synthetic-Clinical-Notes
|
[
"task_categories:question-answering",
"task_categories:summarization",
"task_categories:text-generation",
"task_categories:conversational",
"size_categories:100K<n<1M",
"language:en",
"license:cc-by-nc-sa-4.0",
"medical",
"synthetic",
"arxiv:2309.00237",
"region:us"
] |
2023-09-01T00:47:59+00:00
|
{"language": ["en"], "license": "cc-by-nc-sa-4.0", "size_categories": ["100K<n<1M"], "task_categories": ["question-answering", "summarization", "text-generation", "conversational"], "pretty_name": "Asclepius: Synthetic Clincal Notes & Instruction Dataset", "tags": ["medical", "synthetic"]}
|
2024-02-05T07:07:50+00:00
|
[
"2309.00237"
] |
[
"en"
] |
TAGS
#task_categories-question-answering #task_categories-summarization #task_categories-text-generation #task_categories-conversational #size_categories-100K<n<1M #language-English #license-cc-by-nc-sa-4.0 #medical #synthetic #arxiv-2309.00237 #region-us
|
# Asclepius: Synthetic Clinical Notes & Instruction Dataset
## Dataset Description
- Repository: Github
- Paper: URL
### Dataset Summary
This dataset is the official dataset for Asclepius (arxiv)
This dataset is structured in a Clinical Note - Question - Answer format to build clinical LLMs.
- We first synthesized clinical notes from PMC-Patients case reports with GPT-3.5
- Then, we generated instruction-answer pairs for 157k synthetic discharge summaries
### Supported Tasks
- This dataset covers the 8 tasks below:
- Named Entity Recognition
- Abbreviation Expansion
- Relation Extraction
- Temporal Information Extraction
- Coreference Resolution
- Paraphrasing
- Summarization
- Question Answering
### Languages
English
## Dataset Structure
### Data Instances
- 'URL'
- Clinical Note - Question - Answer pairs
### Data Fields
- 'patient_id': Unique case report id from PMC-Patients
- 'patient': Case report text
- 'question': GPT-3.5-generated instruction from the patient note. The prompt used can be checked on our GitHub.
- 'answer': GPT-3.5-generated answer for the given case report and question
- 'task': Corresponding category of the question, one of the tasks listed above
## Dataset Creation
### Source Data
PMC-Patients
### Annotations
We used GPT-3.5-turbo (version 0314).
You can check the prompts on our GitHub.
## Additional Information
### Models
- Asclepius-7B
- Asclepius-13B
- Asclepius-Llama2-7B
- Asclepius-Llama2-13B
### Variants
- The instruction-answer pairs generated from MIMIC-III discharge summaries and the models trained with them are now available on Physionet!
### Licensing Information
CC-BY-NC-SA 4.0
|
[
"# Asclepius: Synthetic Clincal Notes & Instruction Dataset",
"## Dataset Description\n- Repository: Github\n- Paper: URL",
"### Dataset Summary\n\nThis dataset is official dataset for Asclepius (arxiv)\nThis dataset is composed with Clinical Note - Question - Answer format to build a clinical LLMs.\n- We first synthesized synthetic notes from PMC-Patients case reports with GPT-3.5\n- Then, we generate instruction-answer pairs for 157k synthetic discharge summaries",
"### Supported Tasks\n\n- This dataset covers below 8 tasks\n - Named Entity Recognition\n - Abbreviation Expansion\n - Relation Extraction\n - Temporal Information Extraction\n - Coreference Resolution\n - Paraphrasing\n - Summarization\n - Question Answering",
"### Languages\n\nEnglish",
"## Dataset Structure",
"### Data Instances\n\n- 'URL'\n - Clinical Note - Question - Answer pairs",
"### Data Fields\n\n- 'patient_id': Unique case report id from PMC-Patients\n- 'patient': Case report text\n- 'question': GPT-3.5 generated instruction from patient. The used prompt can be checked on github.\n- 'answer': GPT-3.5 generated answer for given case report and question\n- 'task': Corresponding category of question. One of above listsed",
"## Dataset Creation",
"### Source Data\nPMC-Patients",
"### Annotations\nWe used GPT-3.5-turbo (version 0314).\nYou can check the prompts on our github.",
"## Additional Information",
"### Models\n - Asclepius-7B\n - Asclepius-13B\n - Asclepius-Llama2-7B\n - Asclepius-Llama2-13B",
"### Variants\n - The instruction-answer pairs generated from MIMIC-III discharge summaries and the models trained with them are now available on Physionet!",
"### Licensing Information\n\nCC-BY-NC-SA 4.0"
] |
[
"TAGS\n#task_categories-question-answering #task_categories-summarization #task_categories-text-generation #task_categories-conversational #size_categories-100K<n<1M #language-English #license-cc-by-nc-sa-4.0 #medical #synthetic #arxiv-2309.00237 #region-us \n",
"# Asclepius: Synthetic Clincal Notes & Instruction Dataset",
"## Dataset Description\n- Repository: Github\n- Paper: URL",
"### Dataset Summary\n\nThis dataset is official dataset for Asclepius (arxiv)\nThis dataset is composed with Clinical Note - Question - Answer format to build a clinical LLMs.\n- We first synthesized synthetic notes from PMC-Patients case reports with GPT-3.5\n- Then, we generate instruction-answer pairs for 157k synthetic discharge summaries",
"### Supported Tasks\n\n- This dataset covers below 8 tasks\n - Named Entity Recognition\n - Abbreviation Expansion\n - Relation Extraction\n - Temporal Information Extraction\n - Coreference Resolution\n - Paraphrasing\n - Summarization\n - Question Answering",
"### Languages\n\nEnglish",
"## Dataset Structure",
"### Data Instances\n\n- 'URL'\n - Clinical Note - Question - Answer pairs",
"### Data Fields\n\n- 'patient_id': Unique case report id from PMC-Patients\n- 'patient': Case report text\n- 'question': GPT-3.5 generated instruction from patient. The used prompt can be checked on github.\n- 'answer': GPT-3.5 generated answer for given case report and question\n- 'task': Corresponding category of question. One of above listsed",
"## Dataset Creation",
"### Source Data\nPMC-Patients",
"### Annotations\nWe used GPT-3.5-turbo (version 0314).\nYou can check the prompts on our github.",
"## Additional Information",
"### Models\n - Asclepius-7B\n - Asclepius-13B\n - Asclepius-Llama2-7B\n - Asclepius-Llama2-13B",
"### Variants\n - The instruction-answer pairs generated from MIMIC-III discharge summaries and the models trained with them are now available on Physionet!",
"### Licensing Information\n\nCC-BY-NC-SA 4.0"
] |
[
93,
19,
16,
93,
59,
5,
6,
20,
97,
5,
10,
30,
5,
40,
40,
14
] |
[
"passage: TAGS\n#task_categories-question-answering #task_categories-summarization #task_categories-text-generation #task_categories-conversational #size_categories-100K<n<1M #language-English #license-cc-by-nc-sa-4.0 #medical #synthetic #arxiv-2309.00237 #region-us \n# Asclepius: Synthetic Clincal Notes & Instruction Dataset## Dataset Description\n- Repository: Github\n- Paper: URL### Dataset Summary\n\nThis dataset is official dataset for Asclepius (arxiv)\nThis dataset is composed with Clinical Note - Question - Answer format to build a clinical LLMs.\n- We first synthesized synthetic notes from PMC-Patients case reports with GPT-3.5\n- Then, we generate instruction-answer pairs for 157k synthetic discharge summaries### Supported Tasks\n\n- This dataset covers below 8 tasks\n - Named Entity Recognition\n - Abbreviation Expansion\n - Relation Extraction\n - Temporal Information Extraction\n - Coreference Resolution\n - Paraphrasing\n - Summarization\n - Question Answering### Languages\n\nEnglish## Dataset Structure### Data Instances\n\n- 'URL'\n - Clinical Note - Question - Answer pairs### Data Fields\n\n- 'patient_id': Unique case report id from PMC-Patients\n- 'patient': Case report text\n- 'question': GPT-3.5 generated instruction from patient. The used prompt can be checked on github.\n- 'answer': GPT-3.5 generated answer for given case report and question\n- 'task': Corresponding category of question. One of above listsed## Dataset Creation### Source Data\nPMC-Patients### Annotations\nWe used GPT-3.5-turbo (version 0314).\nYou can check the prompts on our github.## Additional Information### Models\n - Asclepius-7B\n - Asclepius-13B\n - Asclepius-Llama2-7B\n - Asclepius-Llama2-13B"
] |
63aa8508ce031392d2407c8eab3e3b965c834bb1
|
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w](https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_FINETUNE2_3w",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-16T18:08:49.562791](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_FINETUNE2_3w/blob/main/results_2023-10-16T18-08-49.562791.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.19316275167785235,
"em_stderr": 0.004042912227684817,
"f1": 0.2405106963087243,
"f1_stderr": 0.004012764038516629,
"acc": 0.44387175953656505,
"acc_stderr": 0.010404181547690496
},
"harness|drop|3": {
"em": 0.19316275167785235,
"em_stderr": 0.004042912227684817,
"f1": 0.2405106963087243,
"f1_stderr": 0.004012764038516629
},
"harness|gsm8k|5": {
"acc": 0.1197877179681577,
"acc_stderr": 0.008944213403553045
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827948
}
}
```
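A sketch of pulling these aggregated numbers programmatically, assuming the `results` configuration and its `latest` split (both listed in this repo's configuration metadata) mirror the JSON above; the exact record layout is an assumption:

```python
from datasets import load_dataset

# "results" config and "latest" split names taken from this repo's
# configuration list; the per-record layout is assumed.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_FINETUNE2_3w",
    "results",
    split="latest",
)
print(results[0])  # aggregated em / f1 / acc per harness task
```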
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_FINETUNE2_3w
|
[
"region:us"
] |
2023-09-01T01:01:02+00:00
|
{"pretty_name": "Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w", "dataset_summary": "Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w](https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_FINETUNE2_3w\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T18:08:49.562791](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_FINETUNE2_3w/blob/main/results_2023-10-16T18-08-49.562791.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19316275167785235,\n \"em_stderr\": 0.004042912227684817,\n \"f1\": 0.2405106963087243,\n \"f1_stderr\": 0.004012764038516629,\n \"acc\": 0.44387175953656505,\n \"acc_stderr\": 0.010404181547690496\n },\n \"harness|drop|3\": {\n \"em\": 0.19316275167785235,\n \"em_stderr\": 0.004042912227684817,\n \"f1\": 0.2405106963087243,\n \"f1_stderr\": 0.004012764038516629\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1197877179681577,\n \"acc_stderr\": 0.008944213403553045\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827948\n }\n}\n```", "repo_url": "https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|arc:challenge|25_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T18_08_49.562791", "path": ["**/details_harness|drop|3_2023-10-16T18-08-49.562791.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T18-08-49.562791.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T18_08_49.562791", "path": ["**/details_harness|gsm8k|5_2023-10-16T18-08-49.562791.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T18-08-49.562791.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hellaswag|10_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T02:00:37.235761.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-01T02:00:37.235761.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T02:00:37.235761.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-01T02:00:37.235761.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-01T02:00:37.235761.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-01T02:00:37.235761.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T18_08_49.562791", "path": ["**/details_harness|winogrande|5_2023-10-16T18-08-49.562791.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T18-08-49.562791.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_01T02_00_37.235761", "path": ["results_2023-09-01T02:00:37.235761.parquet"]}, {"split": "2023_10_16T18_08_49.562791", "path": ["results_2023-10-16T18-08-49.562791.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T18-08-49.562791.parquet"]}]}]}
|
2023-10-16T17:09:01+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
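A plausible loading call, assuming the leaderboard's usual `details_<org>__<model>` repository naming (the exact repository id and configuration name are assumptions, not confirmed by this card), is sketched below:

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's naming convention;
# any of the 64 task configurations can be substituted here.
data = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_FINETUNE2_3w",
    "harness_winogrande_5",
    split="train",
)
```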
## Latest results
These are the latest results from run 2023-10-16T18:08:49.562791 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T18:08:49.562791(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T18:08:49.562791(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
32,
31,
180,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model CHIH-HUNG/llama-2-13b-huangyt_FINETUNE2_3w on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T18:08:49.562791(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
16c8055fce532412830a7391485f579397f07fc7
|
# AutoTrain Dataset for project: demo-2
## Dataset Description
This dataset has been automatically processed by AutoTrain for project demo-2.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"context": "In the late 1970s many arterial roads were redesigned as ejes viales; high-volume one-way roads that cross, in theory, Mexico City proper from side to side. The eje vial network is based on a quasi-Cartesian grid, with the ejes themselves being called Eje 1 Poniente, Eje Central, and Eje 1 Oriente, for example, for the north-south roads, and Eje 2 Sur and Eje 3 Norte, for example, for east-west roads. Ring roads are the Circuito Interior (inner ring), Anillo Perif\u00e9rico; the Circuito Exterior Mexiquense (\"State of Mexico outer loop\") toll road skirting the northeastern and eastern edges of the metropolitan area, the Chamapa-La Venta toll road skirting the northwestern edge, and the Arco Norte completely bypassing the metropolitan area in an arc from northwest (Atlacomulco) to north (Tula, Hidalgo) to east (Puebla). A second level (where tolls are charged) of the Perif\u00e9rico, colloquially called the segundo piso (\"second floor\"), was officially opened in 2012, with sections still being completed. The Viaducto Miguel Alem\u00e1n crosses the city east-west from Observatorio to the airport. In 2013 the Superv\u00eda Poniente opened, a toll road linking the new Santa Fe business district with southwestern Mexico City.",
"question": "When were these second level roads opened?",
"answers.text": [
"2012"
],
"answers.answer_start": [
966
],
"feat_id": [
"572694bef1498d1400e8e468"
],
"feat_title": [
"Mexico_City"
]
},
{
"context": "The first Code of Canon Law, 1917, was mostly for the Roman Rite, with limited application to the Eastern Churches. After the Second Vatican Council, (1962 - 1965), another edition was published specifically for the Roman Rite in 1983. Most recently, 1990, the Vatican produced the Code of Canons of the Eastern Churches which became the 1st code of Eastern Catholic Canon Law.",
"question": "For which part of the Roman Catholic Church was the first Code published?",
"answers.text": [
"the Roman Rite"
],
"answers.answer_start": [
50
],
"feat_id": [
"56e1040ecd28a01900c6743e"
],
"feat_title": [
"Canon_law"
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"context": "Value(dtype='string', id=None)",
"question": "Value(dtype='string', id=None)",
"answers.text": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"answers.answer_start": "Sequence(feature=Value(dtype='int32', id=None), length=-1, id=None)",
"feat_id": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"feat_title": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 87433 |
| valid | 10546 |
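For a quick sanity check, a minimal loading sketch (assuming the repository is publicly readable) might look like:

```python
from datasets import load_dataset

# Repository id taken from this card; split names from the table above.
dataset = load_dataset("Aeonai/autotrain-data-demo-2")

print(dataset["train"].num_rows)  # expected: 87433
print(dataset["valid"].num_rows)  # expected: 10546

# Each sample carries the SQuAD-style fields listed above.
sample = dataset["train"][0]
print(sample["question"], sample["answers.text"])
```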
|
Aeonai/autotrain-data-demo-2
|
[
"language:en",
"region:us"
] |
2023-09-01T01:06:21+00:00
|
{"language": ["en"]}
|
2023-09-01T01:12:18+00:00
|
[] |
[
"en"
] |
TAGS
#language-English #region-us
|
AutoTrain Dataset for project: demo-2
=====================================
Dataset Description
-------------------
This dataset has been automatically processed by AutoTrain for project demo-2.
### Languages
The BCP-47 code for the dataset's language is en.
Dataset Structure
-----------------
### Data Instances
A sample from this dataset looks as follows:
### Dataset Fields
The dataset has the following fields (also called "features"):
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
|
[
"### Languages\n\n\nThe BCP-47 code for the dataset's language is en.\n\n\nDataset Structure\n-----------------",
"### Data Instances\n\n\nA sample from this dataset looks as follows:",
"### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):",
"### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:"
] |
[
"TAGS\n#language-English #region-us \n",
"### Languages\n\n\nThe BCP-47 code for the dataset's language is en.\n\n\nDataset Structure\n-----------------",
"### Data Instances\n\n\nA sample from this dataset looks as follows:",
"### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):",
"### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:"
] |
[
10,
26,
17,
23,
27
] |
[
"passage: TAGS\n#language-English #region-us \n### Languages\n\n\nThe BCP-47 code for the dataset's language is en.\n\n\nDataset Structure\n-----------------### Data Instances\n\n\nA sample from this dataset looks as follows:### Dataset Fields\n\n\nThe dataset has the following fields (also called \"features\"):### Dataset Splits\n\n\nThis dataset is split into a train and validation split. The split sizes are as follow:"
] |
529d16ab9ad4932c162e96db360b401616b6c0c7
|
The data comes from Baidu's Dureader_Retrieval.
The data format has been converted to match the format required for training ColBERT.
It includes the following three files:
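The three file names are not listed in this card, so the sketch below is purely illustrative of ColBERT's conventional training inputs (a passage collection, a query file, and training triples); every file name and the exact layout here are assumptions:

```python
import json

# Hypothetical layout following common ColBERT conventions
# (not confirmed by this dataset):
#   collection.tsv : "<pid>\t<passage text>"   one passage per line
#   queries.tsv    : "<qid>\t<query text>"     one query per line
#   triples.jsonl  : "[qid, positive_pid, negative_pid]" per line
with open("triples.jsonl", encoding="utf-8") as f:
    for line in f:
        qid, pos_pid, neg_pid = json.loads(line)
        print(qid, pos_pid, neg_pid)  # one training triple per line
```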
|
Eson-llm/Dureader_Retrierval_ColBERTFormat
|
[
"license:mit",
"region:us"
] |
2023-09-01T01:21:24+00:00
|
{"license": "mit"}
|
2023-09-01T06:12:44+00:00
|
[] |
[] |
TAGS
#license-mit #region-us
|
The data comes from Baidu's Dureader_Retrieval.
The data format has been converted to match the format required for training ColBERT.
It includes the following three files:
|
[] |
[
"TAGS\n#license-mit #region-us \n"
] |
[
11
] |
[
"passage: TAGS\n#license-mit #region-us \n"
] |
0556084cc29ef869674493070fd58900a8a6e666
|
# Dataset Card for "llama_2_product_titles-esci_test-sft"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
qazisaad/llama_2_product_titles-esci_test-sft
|
[
"region:us"
] |
2023-09-01T01:53:24+00:00
|
{"dataset_info": {"features": [{"name": "index", "dtype": "int64"}, {"name": "query", "dtype": "string"}, {"name": "average_score", "dtype": "float64"}, {"name": "total_score", "dtype": "float64"}, {"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 4761528, "num_examples": 13996}], "download_size": 1243412, "dataset_size": 4761528}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-01T01:53:25+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "llama_2_product_titles-esci_test-sft"
More Information needed
|
[
"# Dataset Card for \"llama_2_product_titles-esci_test-sft\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"llama_2_product_titles-esci_test-sft\"\n\nMore Information needed"
] |
[
6,
27
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"llama_2_product_titles-esci_test-sft\"\n\nMore Information needed"
] |
efc7d5b4f795a9d8c6475ae14ea1cc39f9ec9ad7
|
# Dataset Card for "llama_2-product-titles-esci-test-sft-temp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
qazisaad/llama_2-product-titles-esci-test-sft-temp
|
[
"region:us"
] |
2023-09-01T01:59:13+00:00
|
{"dataset_info": {"features": [{"name": "index", "dtype": "int64"}, {"name": "query", "dtype": "string"}, {"name": "average_score", "dtype": "float64"}, {"name": "total_score", "dtype": "float64"}, {"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}, {"name": "preds", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 5401147, "num_examples": 13996}], "download_size": 1569052, "dataset_size": 5401147}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-01T12:28:15+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "llama_2-product-titles-esci-test-sft-temp"
More Information needed
|
[
"# Dataset Card for \"llama_2-product-titles-esci-test-sft-temp\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"llama_2-product-titles-esci-test-sft-temp\"\n\nMore Information needed"
] |
[
6,
28
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"llama_2-product-titles-esci-test-sft-temp\"\n\nMore Information needed"
] |
e3ca05c22cd1253185f3053ef68413164cd7747b
|
# Dataset Card for "llama_2_optimized_product_titles-esci-test-sft"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
qazisaad/llama_2_optimized_product_titles-esci-test-sft
|
[
"region:us"
] |
2023-09-01T02:29:45+00:00
|
{"dataset_info": {"features": [{"name": "index", "dtype": "int64"}, {"name": "product_title", "dtype": "string"}, {"name": "average_score", "dtype": "float64"}, {"name": "total_score", "dtype": "float64"}, {"name": "text", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 4827482, "num_examples": 11924}], "download_size": 2588134, "dataset_size": 4827482}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-01T02:29:46+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "llama_2_optimized_product_titles-esci-test-sft"
More Information needed
|
[
"# Dataset Card for \"llama_2_optimized_product_titles-esci-test-sft\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"llama_2_optimized_product_titles-esci-test-sft\"\n\nMore Information needed"
] |
[
6,
30
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"llama_2_optimized_product_titles-esci-test-sft\"\n\nMore Information needed"
] |
d4a3e002d8aef722f698c0be7fae7572d68d7971
|
# Dataset Card for "llama-2-optimized-product-titles-esci-test-sft-temp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
qazisaad/llama-2-optimized-product-titles-esci-test-sft-temp
|
[
"region:us"
] |
2023-09-01T02:31:18+00:00
|
{"dataset_info": {"features": [{"name": "index", "dtype": "int64"}, {"name": "product_title", "dtype": "string"}, {"name": "average_score", "dtype": "float64"}, {"name": "total_score", "dtype": "float64"}, {"name": "text", "dtype": "string"}, {"name": "preds", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 2224488, "num_examples": 3720}], "download_size": 1243164, "dataset_size": 2224488}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-01T10:22:52+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "llama-2-optimized-product-titles-esci-test-sft-temp"
More Information needed
|
[
"# Dataset Card for \"llama-2-optimized-product-titles-esci-test-sft-temp\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"llama-2-optimized-product-titles-esci-test-sft-temp\"\n\nMore Information needed"
] |
[
6,
31
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"llama-2-optimized-product-titles-esci-test-sft-temp\"\n\nMore Information needed"
] |
aaa41dff678e551cc111ecfe4e389107d559ce0f
|
# Dataset Card for "autotree_pmlb_ring_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
yzhuang/autotree_pmlb_ring_sgosdt_l256_d3_sd0
|
[
"region:us"
] |
2023-09-01T02:46:27+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "input_x", "sequence": {"sequence": "float32"}}, {"name": "input_y", "sequence": {"sequence": "float32"}}, {"name": "rtg", "sequence": "float64"}, {"name": "status", "sequence": {"sequence": "float32"}}, {"name": "split_threshold", "sequence": {"sequence": "float32"}}, {"name": "split_dimension", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 308080000, "num_examples": 10000}, {"name": "validation", "num_bytes": 308080000, "num_examples": 10000}], "download_size": 206130038, "dataset_size": 616160000}}
|
2023-09-01T02:46:41+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "autotree_pmlb_ring_sgosdt_l256_d3_sd0"
More Information needed
|
[
"# Dataset Card for \"autotree_pmlb_ring_sgosdt_l256_d3_sd0\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"autotree_pmlb_ring_sgosdt_l256_d3_sd0\"\n\nMore Information needed"
] |
[
6,
31
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"autotree_pmlb_ring_sgosdt_l256_d3_sd0\"\n\nMore Information needed"
] |
75b46599cc05280142df8a2efd901a38602af6cc
|
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
zhangzhang9999/code
|
[
"region:us"
] |
2023-09-01T02:59:59+00:00
|
{}
|
2023-09-01T15:37:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Dataset Name
## Dataset Description
- Homepage:
- Repository:
- Paper:
- Leaderboard:
- Point of Contact:
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Dataset Name",
"## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:",
"### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name",
"## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:",
"### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
8,
24,
32,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Dataset Name## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
f505da68e43bc45b99d08f8d6e839792167b6670
|
# Dataset Card for Evaluation run of TheBloke/Airoboros-L2-70B-2.1-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Airoboros-L2-70B-2.1-GPTQ](https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Airoboros-L2-70B-2.1-GPTQ_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-08T02:26:46.433766](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Airoboros-L2-70B-2.1-GPTQ_public/blob/main/results_2023-11-08T02-26-46.433766.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.4241820469798658,
"em_stderr": 0.0050612570385902955,
"f1": 0.5410476090604083,
"f1_stderr": 0.004613044422574753,
"acc": 0.48424459945200166,
"acc_stderr": 0.010393744134050047
},
"harness|drop|3": {
"em": 0.4241820469798658,
"em_stderr": 0.0050612570385902955,
"f1": 0.5410476090604083,
"f1_stderr": 0.004613044422574753
},
"harness|gsm8k|5": {
"acc": 0.15238817285822592,
"acc_stderr": 0.009899572254794198
},
"harness|winogrande|5": {
"acc": 0.8161010260457774,
"acc_stderr": 0.010887916013305896
}
}
```
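The aggregated numbers above are also stored in the "results" configuration declared in this card's metadata; a minimal sketch for pulling them programmatically (config and split names taken from that metadata) might be:

```python
from datasets import load_dataset

# "latest" always points at the most recent aggregated results.
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__Airoboros-L2-70B-2.1-GPTQ_public",
    "results",
    split="latest",
)
print(results[0])
```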
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__Airoboros-L2-70B-2.1-GPTQ
|
[
"region:us"
] |
2023-09-01T03:13:12+00:00
|
{"pretty_name": "Evaluation run of TheBloke/Airoboros-L2-70B-2.1-GPTQ", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/Airoboros-L2-70B-2.1-GPTQ](https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Airoboros-L2-70B-2.1-GPTQ_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-08T02:26:46.433766](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Airoboros-L2-70B-2.1-GPTQ_public/blob/main/results_2023-11-08T02-26-46.433766.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4241820469798658,\n \"em_stderr\": 0.0050612570385902955,\n \"f1\": 0.5410476090604083,\n \"f1_stderr\": 0.004613044422574753,\n \"acc\": 0.48424459945200166,\n \"acc_stderr\": 0.010393744134050047\n },\n \"harness|drop|3\": {\n \"em\": 0.4241820469798658,\n \"em_stderr\": 0.0050612570385902955,\n \"f1\": 0.5410476090604083,\n \"f1_stderr\": 0.004613044422574753\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15238817285822592,\n \"acc_stderr\": 0.009899572254794198\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8161010260457774,\n \"acc_stderr\": 0.010887916013305896\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GPTQ", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_08T02_26_46.433766", "path": ["**/details_harness|drop|3_2023-11-08T02-26-46.433766.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-08T02-26-46.433766.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_08T02_26_46.433766", "path": ["**/details_harness|gsm8k|5_2023-11-08T02-26-46.433766.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-08T02-26-46.433766.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_08T02_26_46.433766", "path": ["**/details_harness|winogrande|5_2023-11-08T02-26-46.433766.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-08T02-26-46.433766.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_08T02_26_46.433766", "path": ["results_2023-11-08T02-26-46.433766.parquet"]}, {"split": "latest", "path": ["results_2023-11-08T02-26-46.433766.parquet"]}]}]}
|
2023-12-01T14:46:24+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/Airoboros-L2-70B-2.1-GPTQ
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/Airoboros-L2-70B-2.1-GPTQ on the Open LLM Leaderboard.
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
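```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Airoboros-L2-70B-2.1-GPTQ_public",
	"harness_winogrande_5",
	split="train")
```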
## Latest results
These are the latest results from run 2023-11-08T02:26:46.433766 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
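```python
{
    "all": {
        "em": 0.4241820469798658,
        "em_stderr": 0.0050612570385902955,
        "f1": 0.5410476090604083,
        "f1_stderr": 0.004613044422574753,
        "acc": 0.48424459945200166,
        "acc_stderr": 0.010393744134050047
    },
    "harness|drop|3": {
        "em": 0.4241820469798658,
        "em_stderr": 0.0050612570385902955,
        "f1": 0.5410476090604083,
        "f1_stderr": 0.004613044422574753
    },
    "harness|gsm8k|5": {
        "acc": 0.15238817285822592,
        "acc_stderr": 0.009899572254794198
    },
    "harness|winogrande|5": {
        "acc": 0.8161010260457774,
        "acc_stderr": 0.010887916013305896
    }
}
```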
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/Airoboros-L2-70B-2.1-GPTQ",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Airoboros-L2-70B-2.1-GPTQ on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-08T02:26:46.433766(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/Airoboros-L2-70B-2.1-GPTQ",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Airoboros-L2-70B-2.1-GPTQ on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-08T02:26:46.433766(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
27,
31,
176,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/Airoboros-L2-70B-2.1-GPTQ## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Airoboros-L2-70B-2.1-GPTQ on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-08T02:26:46.433766(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
9c3e2547e2fef7d78e255b1bd47f5a0c03ba2a54
|
# Dataset Card for "test1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
kofsitho/test1
|
[
"region:us"
] |
2023-09-01T03:47:51+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "transcription", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24321123.0, "num_examples": 190}], "download_size": 20380642, "dataset_size": 24321123.0}}
|
2023-09-01T03:48:12+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "test1"
More Information needed
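A minimal loading sketch, assuming the default configuration and the single train split described in this record's metadata (audio plus transcription features):

```python
from datasets import load_dataset

# Load the "train" split (190 audio/transcription examples per the metadata above).
data = load_dataset("kofsitho/test1", split="train")
print(data[0]["transcription"])
```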
|
[
"# Dataset Card for \"test1\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"test1\"\n\nMore Information needed"
] |
[
6,
12
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"test1\"\n\nMore Information needed"
] |
b388e44a752b36d9928dd886c6b65084dc97937f
|
# Dataset Card for Evaluation run of luffycodes/mcq-vicuna-13b-v1.5
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/luffycodes/mcq-vicuna-13b-v1.5
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [luffycodes/mcq-vicuna-13b-v1.5](https://huggingface.co/luffycodes/mcq-vicuna-13b-v1.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_luffycodes__mcq-vicuna-13b-v1.5",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-15T06:51:11.600921](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__mcq-vicuna-13b-v1.5/blob/main/results_2023-10-15T06-51-11.600921.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.28366191275167785,
"em_stderr": 0.004616354866148242,
"f1": 0.34618708053691377,
"f1_stderr": 0.004545404408691654,
"acc": 0.40521747299651206,
"acc_stderr": 0.009982345972620842
},
"harness|drop|3": {
"em": 0.28366191275167785,
"em_stderr": 0.004616354866148242,
"f1": 0.34618708053691377,
"f1_stderr": 0.004545404408691654
},
"harness|gsm8k|5": {
"acc": 0.0803639120545868,
"acc_stderr": 0.007488258573239077
},
"harness|winogrande|5": {
"acc": 0.7300710339384373,
"acc_stderr": 0.012476433372002604
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_luffycodes__mcq-vicuna-13b-v1.5
|
[
"region:us"
] |
2023-09-01T04:01:58+00:00
|
{"pretty_name": "Evaluation run of luffycodes/mcq-vicuna-13b-v1.5", "dataset_summary": "Dataset automatically created during the evaluation run of model [luffycodes/mcq-vicuna-13b-v1.5](https://huggingface.co/luffycodes/mcq-vicuna-13b-v1.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_luffycodes__mcq-vicuna-13b-v1.5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T06:51:11.600921](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__mcq-vicuna-13b-v1.5/blob/main/results_2023-10-15T06-51-11.600921.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.28366191275167785,\n \"em_stderr\": 0.004616354866148242,\n \"f1\": 0.34618708053691377,\n \"f1_stderr\": 0.004545404408691654,\n \"acc\": 0.40521747299651206,\n \"acc_stderr\": 0.009982345972620842\n },\n \"harness|drop|3\": {\n \"em\": 0.28366191275167785,\n \"em_stderr\": 0.004616354866148242,\n \"f1\": 0.34618708053691377,\n \"f1_stderr\": 0.004545404408691654\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0803639120545868,\n \"acc_stderr\": 0.007488258573239077\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7300710339384373,\n \"acc_stderr\": 0.012476433372002604\n }\n}\n```", "repo_url": "https://huggingface.co/luffycodes/mcq-vicuna-13b-v1.5", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|arc:challenge|25_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|arc:challenge|25_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T04_41_01.190569", "path": ["**/details_harness|drop|3_2023-10-13T04-41-01.190569.parquet"]}, {"split": "2023_10_15T06_51_11.600921", "path": ["**/details_harness|drop|3_2023-10-15T06-51-11.600921.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T06-51-11.600921.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T04_41_01.190569", "path": ["**/details_harness|gsm8k|5_2023-10-13T04-41-01.190569.parquet"]}, {"split": "2023_10_15T06_51_11.600921", "path": ["**/details_harness|gsm8k|5_2023-10-15T06-51-11.600921.parquet"]}, {"split": "latest", "path": 
["**/details_harness|gsm8k|5_2023-10-15T06-51-11.600921.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hellaswag|10_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hellaswag|10_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T05:01:33.006362.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-01T05:01:33.006362.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T06:07:11.964362.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T06:07:11.964362.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T06:07:11.964362.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-01T06:07:11.964362.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", 
"data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": 
"2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": 
["**/details_harness|hendrycksTest-management|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": 
"2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": 
"2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-01T06:07:11.964362.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-01T06:07:11.964362.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T04_41_01.190569", "path": ["**/details_harness|winogrande|5_2023-10-13T04-41-01.190569.parquet"]}, {"split": "2023_10_15T06_51_11.600921", "path": ["**/details_harness|winogrande|5_2023-10-15T06-51-11.600921.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T06-51-11.600921.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_01T05_01_33.006362", "path": ["results_2023-09-01T05:01:33.006362.parquet"]}, {"split": "2023_09_01T06_07_11.964362", "path": ["results_2023-09-01T06:07:11.964362.parquet"]}, {"split": "2023_10_13T04_41_01.190569", "path": ["results_2023-10-13T04-41-01.190569.parquet"]}, {"split": "2023_10_15T06_51_11.600921", "path": ["results_2023-10-15T06-51-11.600921.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T06-51-11.600921.parquet"]}]}]}
|
2023-10-15T05:51:23+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of luffycodes/mcq-vicuna-13b-v1.5
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model luffycodes/mcq-vicuna-13b-v1.5 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
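A minimal sketch (the repo id here is an assumption following the `details_{org}__{model}` naming convention used by the other evaluation datasets in this collection):

```python
from datasets import load_dataset

# Repo id assumed from the details_{org}__{model} convention; adjust if it differs
data = load_dataset("open-llm-leaderboard/details_luffycodes__mcq-vicuna-13b-v1.5",
	"harness_winogrande_5",
	split="train")
```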
## Latest results
These are the latest results from run 2023-10-15T06:51:11.600921 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of luffycodes/mcq-vicuna-13b-v1.5",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model luffycodes/mcq-vicuna-13b-v1.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T06:51:11.600921(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of luffycodes/mcq-vicuna-13b-v1.5",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model luffycodes/mcq-vicuna-13b-v1.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T06:51:11.600921(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of luffycodes/mcq-vicuna-13b-v1.5## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model luffycodes/mcq-vicuna-13b-v1.5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T06:51:11.600921(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
92c205c276a1ae1efb0f7c382a98e25093dac1fd
|
# Dataset Card for "result_with_w2v2_baseline_aug"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
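As a starting point, the dataset can be loaded with `datasets`; this is an illustrative sketch (not part of the original card), with column names taken from the `dataset_info` metadata below:

```python
from datasets import load_dataset

# Columns per dataset_info: audio (16 kHz), id, w2v2_baseline_transcription, w2v2_baseline_norm
ds = load_dataset("quocanh34/result_with_w2v2_baseline_aug", split="train")
example = ds[0]
print(example["id"], example["w2v2_baseline_transcription"])
```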
|
quocanh34/result_with_w2v2_baseline_aug
|
[
"region:us"
] |
2023-09-01T04:09:49+00:00
|
{"dataset_info": {"features": [{"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "id", "dtype": "string"}, {"name": "w2v2_baseline_transcription", "dtype": "string"}, {"name": "w2v2_baseline_norm", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 174371756.027, "num_examples": 1299}], "download_size": 164200794, "dataset_size": 174371756.027}}
|
2023-09-01T04:09:55+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "result_with_w2v2_baseline_aug"
More Information needed
|
[
"# Dataset Card for \"result_with_w2v2_baseline_aug\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"result_with_w2v2_baseline_aug\"\n\nMore Information needed"
] |
[
6,
24
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"result_with_w2v2_baseline_aug\"\n\nMore Information needed"
] |
0ecd3b24ad8b9bacece027ffacd1ef2c834f3118
|
# Dataset Card for "cpgQA-v1.0-unique-context-for-flan-t5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
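As an illustrative sketch (fields taken from the `dataset_info` metadata below), the extractive-QA style train/test splits can be loaded like this:

```python
from datasets import load_dataset

# Extractive QA columns per dataset_info: title, id, question, answer_text, answer_start, context
ds = load_dataset("minh21/cpgQA-v1.0-unique-context-for-flan-t5")
sample = ds["train"][0]
print(sample["question"])
# answer_start indexes the answer span inside the context
print(sample["context"][sample["answer_start"]:sample["answer_start"] + len(sample["answer_text"])])
```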
|
minh21/cpgQA-v1.0-unique-context-for-flan-t5
|
[
"region:us"
] |
2023-09-01T04:27:09+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "title", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "question", "dtype": "string"}, {"name": "answer_text", "dtype": "string"}, {"name": "answer_start", "dtype": "int64"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1132786.0440713535, "num_examples": 860}, {"name": "test", "num_bytes": 180144.0, "num_examples": 144}], "download_size": 29642, "dataset_size": 1312930.0440713535}}
|
2023-09-01T04:37:05+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "cpgQA-v1.0-unique-context-for-flan-t5"
More Information needed
|
[
"# Dataset Card for \"cpgQA-v1.0-unique-context-for-flan-t5\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"cpgQA-v1.0-unique-context-for-flan-t5\"\n\nMore Information needed"
] |
[
6,
29
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"cpgQA-v1.0-unique-context-for-flan-t5\"\n\nMore Information needed"
] |
f044c681f462188e4d8b829d992bec2d43aeff4b
|
# Dataset Card for "phomt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
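As an illustrative sketch (fields taken from the `dataset_info` metadata below), each row holds a Vietnamese-English sentence pair plus a per-example loss score:

```python
from datasets import load_dataset

# Fields per dataset_info: vi, en, loss
ds = load_dataset("hieunguyen1053/phomt-filtered", split="train")
pair = ds[0]
print(pair["vi"], "->", pair["en"], "| loss:", pair["loss"])
```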
|
hieunguyen1053/phomt-filtered
|
[
"region:us"
] |
2023-09-01T04:41:26+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "vi", "dtype": "string"}, {"name": "en", "dtype": "string"}, {"name": "loss", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 560715693, "num_examples": 2977999}], "download_size": 337506156, "dataset_size": 560715693}}
|
2023-09-01T04:43:31+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "phomt"
More Information needed
|
[
"# Dataset Card for \"phomt\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"phomt\"\n\nMore Information needed"
] |
[
6,
13
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"phomt\"\n\nMore Information needed"
] |
04a3ade0c000c9ba58f08e044f7e668f36728f06
|
# Dataset of 02
This is the dataset of 02, containing 199 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)). A short download sketch follows the table below.
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 199 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 430 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 199 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 199 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 199 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 199 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 199 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 430 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 430 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 430 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
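Any of the archives listed above can be fetched programmatically. A minimal sketch using `huggingface_hub` (the repo id is this dataset's id, and the filename is one of the packages in the table):

```python
from huggingface_hub import hf_hub_download

# Download one packaged archive from this dataset repository
path = hf_hub_download(
    repo_id="CyberHarem/02_darlinginthefranxx",
    filename="dataset-512x512.zip",
    repo_type="dataset",
)
print(path)  # local cache path of the downloaded zip
```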
|
CyberHarem/02_darlinginthefranxx
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-01T04:48:14+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2023-09-17T16:27:07+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of 02
=============
This is the dataset of 02, containing 199 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
|
[] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
[
44
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
0d3016c344f1c4cce17380bbcfa8a7cc87f549b9
|
# Dataset Card for Evaluation run of uukuguy/speechless-llama2-luban-orca-platypus-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-llama2-luban-orca-platypus-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-llama2-luban-orca-platypus-13b](https://huggingface.co/uukuguy/speechless-llama2-luban-orca-platypus-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-llama2-luban-orca-platypus-13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-16T17:51:55.747438](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-llama2-luban-orca-platypus-13b/blob/main/results_2023-10-16T17-51-55.747438.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.006921140939597316,
"em_stderr": 0.0008490247804930292,
"f1": 0.11193687080536992,
"f1_stderr": 0.0020523308364626394,
"acc": 0.4264965386587744,
"acc_stderr": 0.009679849375871168
},
"harness|drop|3": {
"em": 0.006921140939597316,
"em_stderr": 0.0008490247804930292,
"f1": 0.11193687080536992,
"f1_stderr": 0.0020523308364626394
},
"harness|gsm8k|5": {
"acc": 0.08188021228203184,
"acc_stderr": 0.007552338527716947
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.011807360224025388
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_uukuguy__speechless-llama2-luban-orca-platypus-13b
|
[
"region:us"
] |
2023-09-01T04:55:07+00:00
|
{"pretty_name": "Evaluation run of uukuguy/speechless-llama2-luban-orca-platypus-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [uukuguy/speechless-llama2-luban-orca-platypus-13b](https://huggingface.co/uukuguy/speechless-llama2-luban-orca-platypus-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-llama2-luban-orca-platypus-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T17:51:55.747438](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-llama2-luban-orca-platypus-13b/blob/main/results_2023-10-16T17-51-55.747438.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006921140939597316,\n \"em_stderr\": 0.0008490247804930292,\n \"f1\": 0.11193687080536992,\n \"f1_stderr\": 0.0020523308364626394,\n \"acc\": 0.4264965386587744,\n \"acc_stderr\": 0.009679849375871168\n },\n \"harness|drop|3\": {\n \"em\": 0.006921140939597316,\n \"em_stderr\": 0.0008490247804930292,\n \"f1\": 0.11193687080536992,\n \"f1_stderr\": 0.0020523308364626394\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08188021228203184,\n \"acc_stderr\": 0.007552338527716947\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025388\n }\n}\n```", "repo_url": "https://huggingface.co/uukuguy/speechless-llama2-luban-orca-platypus-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|arc:challenge|25_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T17_51_55.747438", "path": ["**/details_harness|drop|3_2023-10-16T17-51-55.747438.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T17-51-55.747438.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T17_51_55.747438", "path": ["**/details_harness|gsm8k|5_2023-10-16T17-51-55.747438.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T17-51-55.747438.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hellaswag|10_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hellaswag|10_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T05:54:43.169153.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-01T05:54:43.169153.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T05:54:43.169153.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-01T05:54:43.169153.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-01T05:54:43.169153.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-01T05:54:43.169153.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T17_51_55.747438", "path": ["**/details_harness|winogrande|5_2023-10-16T17-51-55.747438.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T17-51-55.747438.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_01T05_54_43.169153", "path": ["results_2023-09-01T05:54:43.169153.parquet"]}, {"split": "2023_10_16T17_51_55.747438", "path": ["results_2023-10-16T17-51-55.747438.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T17-51-55.747438.parquet"]}]}]}
|
2023-10-16T16:52:08+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of uukuguy/speechless-llama2-luban-orca-platypus-13b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model uukuguy/speechless-llama2-luban-orca-platypus-13b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
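For instance, mirroring the snippet given in this card's source above:

```python
from datasets import load_dataset

# Load one task configuration from the evaluation details repository
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-llama2-luban-orca-platypus-13b",
	"harness_winogrande_5",
	split="train")
```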
## Latest results
These are the latest results from run 2023-10-16T17:51:55.747438 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of uukuguy/speechless-llama2-luban-orca-platypus-13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-llama2-luban-orca-platypus-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T17:51:55.747438(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of uukuguy/speechless-llama2-luban-orca-platypus-13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-llama2-luban-orca-platypus-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T17:51:55.747438(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
32,
31,
180,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of uukuguy/speechless-llama2-luban-orca-platypus-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-llama2-luban-orca-platypus-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T17:51:55.747438(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
a9161b50803da0114dfd808912158d802725dfa4
|
Downloaded from the following source:
https://huggingface.co/datasets/takaaki-inada/databricks-dolly-15k-ja-zundamon/tree/main
|
ToPo-ToPo/databricks-dolly-15k-ja-zundamon
|
[
"region:us"
] |
2023-09-01T05:11:00+00:00
|
{}
|
2023-09-01T05:13:44+00:00
|
[] |
[] |
TAGS
#region-us
|
Downloaded from the following source:
URL
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
c1c96105710ddb5f90dd694ebab5a344aa89e147
|
# Dataset of MIKU
This is the dataset of MIKU, containing 99 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 99 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 214 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 99 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 99 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 99 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 99 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 99 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 214 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 214 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 214 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/miku_darlinginthefranxx
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-01T05:15:27+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2023-09-17T16:27:09+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of MIKU
===============
This is the dataset of MIKU, containing 99 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
|
[] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
[
44
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
ad79c9ef91835353ad5000909a054065675c65f4
|
# Dataset of ICHIGO
This is the dataset of ICHIGO, containing 187 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 187 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 383 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 187 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 187 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 187 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 187 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 187 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 383 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 383 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 383 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/ichigo_darlinginthefranxx
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-01T05:25:05+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2023-09-17T16:27:11+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of ICHIGO
=================
This is the dataset of ICHIGO, containing 187 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
|
[] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
[
44
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
901382e7f43108d31ae916063d43e74a09f42892
|
# Dataset of ikuno_darlinginthefranxx
This is the dataset of ikuno_darlinginthefranxx, containing 62 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 62 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 135 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 62 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 62 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 62 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 62 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 62 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 135 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 135 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 135 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/ikuno_darlinginthefranxx
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-01T05:31:14+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2023-09-17T16:27:13+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of ikuno\_darlinginthefranxx
====================================
This is the dataset of ikuno\_darlinginthefranxx, containing 62 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
|
[] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
[
44
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
755aea1e666fb6441fb1a9a64d3660b8459632db
|
# Dataset of KOKORO
This is the dataset of KOKORO, containing 129 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 129 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 273 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 129 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 129 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 129 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 129 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 129 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 273 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 273 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 273 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/kokoro_darlinginthefranxx
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-01T05:42:54+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2023-09-17T16:27:15+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of KOKORO
=================
This is the dataset of KOKORO, containing 129 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
|
[] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
[
44
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
202d642606f82aa1e69bbf8ce53880d293bd6d71
|
# Dataset of NANA
This is the dataset of NANA, containing 76 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 76 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 145 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 76 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 76 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 76 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 76 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 76 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 145 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 145 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 145 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/nana_darlinginthefranxx
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-01T05:49:10+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2023-09-17T16:27:17+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of NANA
===============
This is the dataset of NANA, containing 76 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
|
[] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
[
44
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
b2812282a7401fa892b90bb1eebc06634d389841
|
# Dataset Card for AnatEM
## Dataset Description
- **Homepage:** http://nactem.ac.uk/anatomytagger/#AnatEM
- **Pubmed:** True
- **Public:** True
- **Tasks:** NER
The extended Anatomical Entity Mention corpus (AnatEM) consists of 1212 documents (approx. 250,000 words) manually annotated to identify over 13,000 mentions of anatomical entities. Each annotation is assigned one of 12 granularity-based types such as Cellular component, Tissue and Organ, defined with reference to the Common Anatomy Reference Ontology.
## Citation Information
```
@article{pyysalo2014anatomical,
title={Anatomical entity mention recognition at literature scale},
author={Pyysalo, Sampo and Ananiadou, Sophia},
journal={Bioinformatics},
volume={30},
number={6},
pages={868--875},
year={2014},
publisher={Oxford University Press}
}
```
|
masaenger/anat_em
|
[
"multilinguality:monolingual",
"language:en",
"license:cc-by-sa-3.0",
"region:us"
] |
2023-09-01T06:03:47+00:00
|
{"language": ["en"], "license": "cc-by-sa-3.0", "multilinguality": "monolingual", "pretty_name": "AnatEM", "bigbio_language": ["English"], "bigbio_license_shortname": "CC_BY_SA_3p0", "homepage": "http://nactem.ac.uk/anatomytagger/#AnatEM", "bigbio_pubmed": true, "bigbio_public": true, "bigbio_tasks": ["NAMED_ENTITY_RECOGNITION"]}
|
2023-09-01T06:06:12+00:00
|
[] |
[
"en"
] |
TAGS
#multilinguality-monolingual #language-English #license-cc-by-sa-3.0 #region-us
|
# Dataset Card for AnatEM
## Dataset Description
- Homepage: URL
- Pubmed: True
- Public: True
- Tasks: NER
The extended Anatomical Entity Mention corpus (AnatEM) consists of 1212 documents (approx. 250,000 words) manually annotated to identify over 13,000 mentions of anatomical entities. Each annotation is assigned one of 12 granularity-based types such as Cellular component, Tissue and Organ, defined with reference to the Common Anatomy Reference Ontology.
|
[
"# Dataset Card for AnatEM",
"## Dataset Description\n\n- Homepage: URL\n- Pubmed: True\n- Public: True\n- Tasks: NER\n\n\nThe extended Anatomical Entity Mention corpus (AnatEM) consists of 1212 documents (approx. 250,000 words) manually annotated to identify over 13,000 mentions of anatomical entities. Each annotation is assigned one of 12 granularity-based types such as Cellular component, Tissue and Organ, defined with reference to the Common Anatomy Reference Ontology."
] |
[
"TAGS\n#multilinguality-monolingual #language-English #license-cc-by-sa-3.0 #region-us \n",
"# Dataset Card for AnatEM",
"## Dataset Description\n\n- Homepage: URL\n- Pubmed: True\n- Public: True\n- Tasks: NER\n\n\nThe extended Anatomical Entity Mention corpus (AnatEM) consists of 1212 documents (approx. 250,000 words) manually annotated to identify over 13,000 mentions of anatomical entities. Each annotation is assigned one of 12 granularity-based types such as Cellular component, Tissue and Organ, defined with reference to the Common Anatomy Reference Ontology."
] |
[
29,
8,
112
] |
[
"passage: TAGS\n#multilinguality-monolingual #language-English #license-cc-by-sa-3.0 #region-us \n# Dataset Card for AnatEM## Dataset Description\n\n- Homepage: URL\n- Pubmed: True\n- Public: True\n- Tasks: NER\n\n\nThe extended Anatomical Entity Mention corpus (AnatEM) consists of 1212 documents (approx. 250,000 words) manually annotated to identify over 13,000 mentions of anatomical entities. Each annotation is assigned one of 12 granularity-based types such as Cellular component, Tissue and Organ, defined with reference to the Common Anatomy Reference Ontology."
] |
384e16466095aeee46f8f6f5a8e0f44d87c77f15
|
# Dataset Card for "ancient_city_prompts_SDXL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Falah/ancient_city_prompts_SDXL
|
[
"region:us"
] |
2023-09-01T06:04:52+00:00
|
{"dataset_info": {"features": [{"name": "prompts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 781381246, "num_examples": 1000000}], "download_size": 85390587, "dataset_size": 781381246}}
|
2023-09-01T06:04:56+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "ancient_city_prompts_SDXL"
More Information needed
|
[
"# Dataset Card for \"ancient_city_prompts_SDXL\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"ancient_city_prompts_SDXL\"\n\nMore Information needed"
] |
[
6,
21
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"ancient_city_prompts_SDXL\"\n\nMore Information needed"
] |
ecbcb1b9050beefcf2638542a1d2e61a93bff206
|
# Dataset Card for "proba_dataset-3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
terhdavid/proba_dataset-3
|
[
"region:us"
] |
2023-09-01T06:08:40+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "ner", "sequence": {"class_label": {"names": {"0": "O", "1": "B-ORG", "2": "I-ORG"}}}}], "splits": [{"name": "train", "num_bytes": 143190.77989130435, "num_examples": 662}, {"name": "test", "num_bytes": 16006.220108695652, "num_examples": 74}, {"name": "validation", "num_bytes": 16006.220108695652, "num_examples": 74}], "download_size": 35415, "dataset_size": 175203.22010869565}}
|
2023-09-01T06:08:43+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "proba_dataset-3"
More Information needed
|
[
"# Dataset Card for \"proba_dataset-3\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"proba_dataset-3\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"proba_dataset-3\"\n\nMore Information needed"
] |
cd22936bc2d7962af1353de0e9113c81e099741b
|
# Dataset Card for "modern_architectural_style_prompts_SDXL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Falah/modern_architectural_style_prompts_SDXL
|
[
"region:us"
] |
2023-09-01T06:13:46+00:00
|
{"dataset_info": {"features": [{"name": "prompts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 538167526, "num_examples": 1000000}], "download_size": 63348311, "dataset_size": 538167526}}
|
2023-09-01T06:13:50+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "modern_architectural_style_prompts_SDXL"
More Information needed
|
[
"# Dataset Card for \"modern_architectural_style_prompts_SDXL\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"modern_architectural_style_prompts_SDXL\"\n\nMore Information needed"
] |
[
6,
25
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"modern_architectural_style_prompts_SDXL\"\n\nMore Information needed"
] |
78e0c68dd995521e4b123adce50337726078725c
|
This dataset contains question/answer pairs from a French home insurance contract (MRH: Multi-Risk Home Insurance).
It comes from structuring the following open sources:
- https://www.mma.fr/assurance-habitation.html
- https://cap.mma.fr/files/live/sites/mmafr/files/documents-cg/cg410/Habitation_MMA_410p.pdf
The objective of this dataset is to contribute to open source research projects aiming to, for instance:
* fine-tune LLMs on high-quality datasets, specializing them in the insurance domain
* develop new question/answer applications using Retrieval Augmented Generation (RAG) for insurance contracts
* assess the knowledge of language models in the insurance field
* more generally, apply LLMs to the insurance domain for better understanding and increased transparency of this industry.
Other datasets of the same kind (but on other types of insurance, other languages, or from different sources) are also available - or will be available soon - and are part of this research effort.
|
zelros/insurance-fr
|
[
"insurance",
"region:us"
] |
2023-09-01T06:16:26+00:00
|
{"tags": ["insurance"]}
|
2023-10-21T12:19:38+00:00
|
[] |
[] |
TAGS
#insurance #region-us
|
This dataset contains question/answer pairs from a French home insurance contract (MRH: Multi-Risk Home Insurance).
It comes from structuring the following open sources:
- URL
- URL
The objective of this dataset is to contribute to open source research projects aiming to, for instance:
* fine-tune LLMs on high-quality datasets, specializing them in the insurance domain
* develop new question/answer applications using Retrieval Augmented Generation (RAG) for insurance contracts
* assess the knowledge of language models in the insurance field
* more generally, apply LLMs to the insurance domain for better understanding and increased transparency of this industry.
Other datasets of the same kind (but on other types of insurance, other languages, or from different sources) are also available - or will be available soon - and are part of this research effort.
|
[] |
[
"TAGS\n#insurance #region-us \n"
] |
[
9
] |
[
"passage: TAGS\n#insurance #region-us \n"
] |
b7cd1bd70b07c165e00bcddc47d95775eef2b3e7
|
This dataset can also be found here: https://www.workwithdata.com/dataset?entity=politicians
License: cc-by-4.0
|
WorkWithData/politicians
|
[
"region:us"
] |
2023-09-01T06:16:27+00:00
|
{}
|
2023-09-01T06:18:17+00:00
|
[] |
[] |
TAGS
#region-us
|
This dataset can also be found here: URL
License: cc-by-4.0
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
d4788d1450a58d03d7ea0bd7922bd530db9a32b4
|
# Dataset Card for "ner-company-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
terhdavid/ner-company-dataset
|
[
"region:us"
] |
2023-09-01T06:30:44+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "ner", "sequence": {"class_label": {"names": {"0": "O", "1": "B-ORG", "2": "I-ORG"}}}}], "splits": [{"name": "train", "num_bytes": 111394.63994565218, "num_examples": 515}, {"name": "test", "num_bytes": 47802.360054347824, "num_examples": 221}, {"name": "validation", "num_bytes": 47802.360054347824, "num_examples": 221}], "download_size": 41876, "dataset_size": 206999.36005434784}}
|
2023-09-01T07:01:01+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "ner-company-dataset"
More Information needed
|
[
"# Dataset Card for \"ner-company-dataset\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"ner-company-dataset\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"ner-company-dataset\"\n\nMore Information needed"
] |
9d6a2401a705f009be1450aa68d65b17463aa3ff
|
# Dataset Card for "heise_ds_split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Yaasr/heise_ds_split
|
[
"region:us"
] |
2023-09-01T06:44:53+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "summary", "dtype": "string"}, {"name": "source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 157826571.87536153, "num_examples": 53487}, {"name": "test", "num_bytes": 27852095.124638464, "num_examples": 9439}], "download_size": 116864966, "dataset_size": 185678667.0}}
|
2023-09-01T06:46:26+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "heise_ds_split"
More Information needed
|
[
"# Dataset Card for \"heise_ds_split\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"heise_ds_split\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"heise_ds_split\"\n\nMore Information needed"
] |
7b2d120b8fde590ea584054a2257a86946226927
|
# 🇪🇺 🏷️ EuroVoc dataset
This dataset contains more than 3,700,000 documents in 39 languages with associated EuroVoc labels.
## What's Cellar ?
Cellar is the common data repository of the Publications Office of the European Union. Digital publications and metadata are stored in and disseminated via Cellar, in order to be used by humans and machines. Aiming to transparently serve users, Cellar stores multilingual publications and metadata; it is open to all EU citizens and provides machine-readable data.
https://op.europa.eu/fr/web/cellar
## Why was this dataset created ?
"Extreme classification come with challenges of scalability due to large label spaces, data sparsity issues due to insufficient training samples."
https://medium.com/datapy-ai/extreme-multi-label-classification-for-eurovoc-b51d74623820
## How was this dataset created ?
The source code is available, check `cellar.py`
## When was this dataset created ?
14 July 2023
## What are the main characteristics of this dataset ?
There are a total of 39 different languages present in this dataset, of which some are EU languages and some are not. As the following graph illustrates, most of the documents of the dataset are written in EU languages (English being the most present language in the dataset), and the non-EU languages are very poorly represented (for example Arabic, Japanese,...). Note that since the Irish language (`gle`) was granted full official and working status in the EU in 2022, there are very few documents in that language. Additionally, Croatian (`hrv`) is also less represented in the dataset as Croatia is the latest country to have joined the EU in 2013.

The lengths of the documents also vary depending on the language they are written in. The document lengths are quite variable, especially in English. There is therefore quite a large disparity in document lengths in this dataset. Note that this boxplot does not show the outliers, since certain documents can contain up to 86 million characters. The red lines in the boxplot indicate the median length of the documents for each language.

We notice that the documents in Irish show a very wide variability in length, due to the fact that there are very few of them. Therefore, we present the same boxplot without the Irish language in order to visualize the document length distribution in the other languages in more detail.

## How is the data structured ?
An example of a sample of this dataset is the following :
```json
{
  "title": "Commission information notice...",
  "date": "2023-09-29",
  "eurovoc_concepts": ["air transport", "intra-EU transport"],
  "url": "http://publications.europa.eu/resource/cellar/ec99987f-5e69-11ee-9220-01aa75ed71a1",
  "lang": "eng",
  "formats": ["fmx4", "pdfa2a", "xhtml"],
  "text": "To ensure ownership by the relevant actors,..."
}
```
- `title` : title of the document
- `date` : publication date of the document
- `eurovoc_concepts` : list of the EuroVoc concepts related to this document
- `url` : URL to access the document
- `formats` : list of formats in which the original document is available
- `text` : text content of the document
## Bibliography
- Ilias Chalkidis, Emmanouil Fergadiotis, Prodromos Malakasiotis, Nikolaos Aletras, and Ion Androutsopoulos. 2019. Extreme Multi-Label Legal Text Classification: A Case Study in EU Legislation. In Proceedings of the Natural Legal Language Processing Workshop 2019, pages 78–87, Minneapolis, Minnesota. Association for Computational Linguistics.
- I. Chalkidis, M. Fergadiotis, P. Malakasiotis and I. Androutsopoulos, Large-Scale Multi-Label Text Classification on EU Legislation. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), Florence, Italy, (short papers), 2019.
- Andrei-Marius Avram, Vasile Pais, and Dan Ioan Tufis. 2021. PyEuroVoc: A Tool for Multilingual Legal Document Classification with EuroVoc Descriptors. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021), pages 92–101, Held Online. INCOMA Ltd.
- Shaheen, Zein, Wohlgenannt, Gerhard, and Filtz, Erwin. Large scale legal text classification using transformer models. arXiv preprint arXiv:2010.12871, 2020.
## Author(s)
Sébastien Campion <[email protected]>
|
EuropeanParliament/Eurovoc
|
[
"license:eupl-1.1",
"region:us"
] |
2023-09-01T06:46:44+00:00
|
{"license": "eupl-1.1", "configs": [{"config_name": "1996-03", "data_files": "files/1996-03.jsonl.gz"}, {"config_name": "1996-04", "data_files": "files/1996-04.jsonl.gz"}, {"config_name": "1996-05", "data_files": "files/1996-05.jsonl.gz"}, {"config_name": "1996-06", "data_files": "files/1996-06.jsonl.gz"}, {"config_name": "1996-07", "data_files": "files/1996-07.jsonl.gz"}, {"config_name": "1996-08", "data_files": "files/1996-08.jsonl.gz"}, {"config_name": "1996-09", "data_files": "files/1996-09.jsonl.gz"}, {"config_name": "1996-10", "data_files": "files/1996-10.jsonl.gz"}, {"config_name": "1996-11", "data_files": "files/1996-11.jsonl.gz"}, {"config_name": "1996-12", "data_files": "files/1996-12.jsonl.gz"}, {"config_name": "1997-01", "data_files": "files/1997-01.jsonl.gz"}, {"config_name": "1997-02", "data_files": "files/1997-02.jsonl.gz"}, {"config_name": "1997-03", "data_files": "files/1997-03.jsonl.gz"}, {"config_name": "1997-04", "data_files": "files/1997-04.jsonl.gz"}, {"config_name": "1997-05", "data_files": "files/1997-05.jsonl.gz"}, {"config_name": "1997-06", "data_files": "files/1997-06.jsonl.gz"}, {"config_name": "1997-07", "data_files": "files/1997-07.jsonl.gz"}, {"config_name": "1997-08", "data_files": "files/1997-08.jsonl.gz"}, {"config_name": "1997-09", "data_files": "files/1997-09.jsonl.gz"}, {"config_name": "1997-10", "data_files": "files/1997-10.jsonl.gz"}, {"config_name": "1997-11", "data_files": "files/1997-11.jsonl.gz"}, {"config_name": "1997-12", "data_files": "files/1997-12.jsonl.gz"}, {"config_name": "1998-01", "data_files": "files/1998-01.jsonl.gz"}, {"config_name": "1998-02", "data_files": "files/1998-02.jsonl.gz"}, {"config_name": "1998-03", "data_files": "files/1998-03.jsonl.gz"}, {"config_name": "1998-04", "data_files": "files/1998-04.jsonl.gz"}, {"config_name": "1998-05", "data_files": "files/1998-05.jsonl.gz"}, {"config_name": "1998-06", "data_files": "files/1998-06.jsonl.gz"}, {"config_name": "1998-07", "data_files": "files/1998-07.jsonl.gz"}, {"config_name": "1998-08", "data_files": "files/1998-08.jsonl.gz"}, {"config_name": "1998-09", "data_files": "files/1998-09.jsonl.gz"}, {"config_name": "1998-10", "data_files": "files/1998-10.jsonl.gz"}, {"config_name": "1998-11", "data_files": "files/1998-11.jsonl.gz"}, {"config_name": "1998-12", "data_files": "files/1998-12.jsonl.gz"}, {"config_name": "1999-01", "data_files": "files/1999-01.jsonl.gz"}, {"config_name": "1999-02", "data_files": "files/1999-02.jsonl.gz"}, {"config_name": "1999-03", "data_files": "files/1999-03.jsonl.gz"}, {"config_name": "1999-04", "data_files": "files/1999-04.jsonl.gz"}, {"config_name": "1999-05", "data_files": "files/1999-05.jsonl.gz"}, {"config_name": "1999-06", "data_files": "files/1999-06.jsonl.gz"}, {"config_name": "1999-07", "data_files": "files/1999-07.jsonl.gz"}, {"config_name": "1999-08", "data_files": "files/1999-08.jsonl.gz"}, {"config_name": "1999-09", "data_files": "files/1999-09.jsonl.gz"}, {"config_name": "1999-10", "data_files": "files/1999-10.jsonl.gz"}, {"config_name": "1999-11", "data_files": "files/1999-11.jsonl.gz"}, {"config_name": "1999-12", "data_files": "files/1999-12.jsonl.gz"}, {"config_name": "2000-01", "data_files": "files/2000-01.jsonl.gz"}, {"config_name": "2000-02", "data_files": "files/2000-02.jsonl.gz"}, {"config_name": "2000-03", "data_files": "files/2000-03.jsonl.gz"}, {"config_name": "2000-04", "data_files": "files/2000-04.jsonl.gz"}, {"config_name": "2000-05", "data_files": "files/2000-05.jsonl.gz"}, {"config_name": "2000-06", "data_files": 
"files/2000-06.jsonl.gz"}, {"config_name": "2000-07", "data_files": "files/2000-07.jsonl.gz"}, {"config_name": "2000-08", "data_files": "files/2000-08.jsonl.gz"}, {"config_name": "2000-09", "data_files": "files/2000-09.jsonl.gz"}, {"config_name": "2000-10", "data_files": "files/2000-10.jsonl.gz"}, {"config_name": "2000-11", "data_files": "files/2000-11.jsonl.gz"}, {"config_name": "2000-12", "data_files": "files/2000-12.jsonl.gz"}, {"config_name": "2001-01", "data_files": "files/2001-01.jsonl.gz"}, {"config_name": "2001-02", "data_files": "files/2001-02.jsonl.gz"}, {"config_name": "2001-03", "data_files": "files/2001-03.jsonl.gz"}, {"config_name": "2001-04", "data_files": "files/2001-04.jsonl.gz"}, {"config_name": "2001-05", "data_files": "files/2001-05.jsonl.gz"}, {"config_name": "2001-06", "data_files": "files/2001-06.jsonl.gz"}, {"config_name": "2001-07", "data_files": "files/2001-07.jsonl.gz"}, {"config_name": "2001-08", "data_files": "files/2001-08.jsonl.gz"}, {"config_name": "2001-09", "data_files": "files/2001-09.jsonl.gz"}, {"config_name": "2001-10", "data_files": "files/2001-10.jsonl.gz"}, {"config_name": "2001-11", "data_files": "files/2001-11.jsonl.gz"}, {"config_name": "2001-12", "data_files": "files/2001-12.jsonl.gz"}, {"config_name": "2002-01", "data_files": "files/2002-01.jsonl.gz"}, {"config_name": "2002-02", "data_files": "files/2002-02.jsonl.gz"}, {"config_name": "2002-03", "data_files": "files/2002-03.jsonl.gz"}, {"config_name": "2002-04", "data_files": "files/2002-04.jsonl.gz"}, {"config_name": "2002-05", "data_files": "files/2002-05.jsonl.gz"}, {"config_name": "2002-06", "data_files": "files/2002-06.jsonl.gz"}, {"config_name": "2002-07", "data_files": "files/2002-07.jsonl.gz"}, {"config_name": "2002-08", "data_files": "files/2002-08.jsonl.gz"}, {"config_name": "2002-09", "data_files": "files/2002-09.jsonl.gz"}, {"config_name": "2002-10", "data_files": "files/2002-10.jsonl.gz"}, {"config_name": "2002-11", "data_files": "files/2002-11.jsonl.gz"}, {"config_name": "2002-12", "data_files": "files/2002-12.jsonl.gz"}, {"config_name": "2003-01", "data_files": "files/2003-01.jsonl.gz"}, {"config_name": "2003-02", "data_files": "files/2003-02.jsonl.gz"}, {"config_name": "2003-03", "data_files": "files/2003-03.jsonl.gz"}, {"config_name": "2003-04", "data_files": "files/2003-04.jsonl.gz"}, {"config_name": "2003-05", "data_files": "files/2003-05.jsonl.gz"}, {"config_name": "2003-06", "data_files": "files/2003-06.jsonl.gz"}, {"config_name": "2003-07", "data_files": "files/2003-07.jsonl.gz"}, {"config_name": "2003-08", "data_files": "files/2003-08.jsonl.gz"}, {"config_name": "2003-09", "data_files": "files/2003-09.jsonl.gz"}, {"config_name": "2003-10", "data_files": "files/2003-10.jsonl.gz"}, {"config_name": "2003-11", "data_files": "files/2003-11.jsonl.gz"}, {"config_name": "2003-12", "data_files": "files/2003-12.jsonl.gz"}, {"config_name": "2004-01", "data_files": "files/2004-01.jsonl.gz"}, {"config_name": "2004-02", "data_files": "files/2004-02.jsonl.gz"}, {"config_name": "2004-03", "data_files": "files/2004-03.jsonl.gz"}, {"config_name": "2004-04", "data_files": "files/2004-04.jsonl.gz"}, {"config_name": "2004-05", "data_files": "files/2004-05.jsonl.gz"}, {"config_name": "2004-06", "data_files": "files/2004-06.jsonl.gz"}, {"config_name": "2004-07", "data_files": "files/2004-07.jsonl.gz"}, {"config_name": "2004-08", "data_files": "files/2004-08.jsonl.gz"}, {"config_name": "2004-09", "data_files": "files/2004-09.jsonl.gz"}, {"config_name": "2004-10", "data_files": 
"files/2004-10.jsonl.gz"}, {"config_name": "2004-11", "data_files": "files/2004-11.jsonl.gz"}, {"config_name": "2004-12", "data_files": "files/2004-12.jsonl.gz"}, {"config_name": "2005-01", "data_files": "files/2005-01.jsonl.gz"}, {"config_name": "2005-02", "data_files": "files/2005-02.jsonl.gz"}, {"config_name": "2005-03", "data_files": "files/2005-03.jsonl.gz"}, {"config_name": "2005-04", "data_files": "files/2005-04.jsonl.gz"}, {"config_name": "2005-05", "data_files": "files/2005-05.jsonl.gz"}, {"config_name": "2005-06", "data_files": "files/2005-06.jsonl.gz"}, {"config_name": "2005-07", "data_files": "files/2005-07.jsonl.gz"}, {"config_name": "2005-08", "data_files": "files/2005-08.jsonl.gz"}, {"config_name": "2005-09", "data_files": "files/2005-09.jsonl.gz"}, {"config_name": "2005-10", "data_files": "files/2005-10.jsonl.gz"}, {"config_name": "2005-11", "data_files": "files/2005-11.jsonl.gz"}, {"config_name": "2005-12", "data_files": "files/2005-12.jsonl.gz"}, {"config_name": "2006-01", "data_files": "files/2006-01.jsonl.gz"}, {"config_name": "2006-02", "data_files": "files/2006-02.jsonl.gz"}, {"config_name": "2006-03", "data_files": "files/2006-03.jsonl.gz"}, {"config_name": "2006-04", "data_files": "files/2006-04.jsonl.gz"}, {"config_name": "2006-05", "data_files": "files/2006-05.jsonl.gz"}, {"config_name": "2006-06", "data_files": "files/2006-06.jsonl.gz"}, {"config_name": "2006-07", "data_files": "files/2006-07.jsonl.gz"}, {"config_name": "2006-08", "data_files": "files/2006-08.jsonl.gz"}, {"config_name": "2006-09", "data_files": "files/2006-09.jsonl.gz"}, {"config_name": "2006-10", "data_files": "files/2006-10.jsonl.gz"}, {"config_name": "2006-11", "data_files": "files/2006-11.jsonl.gz"}, {"config_name": "2006-12", "data_files": "files/2006-12.jsonl.gz"}, {"config_name": "2007-01", "data_files": "files/2007-01.jsonl.gz"}, {"config_name": "2007-02", "data_files": "files/2007-02.jsonl.gz"}, {"config_name": "2007-03", "data_files": "files/2007-03.jsonl.gz"}, {"config_name": "2007-04", "data_files": "files/2007-04.jsonl.gz"}, {"config_name": "2007-05", "data_files": "files/2007-05.jsonl.gz"}, {"config_name": "2007-06", "data_files": "files/2007-06.jsonl.gz"}, {"config_name": "2007-07", "data_files": "files/2007-07.jsonl.gz"}, {"config_name": "2007-08", "data_files": "files/2007-08.jsonl.gz"}, {"config_name": "2007-09", "data_files": "files/2007-09.jsonl.gz"}, {"config_name": "2007-10", "data_files": "files/2007-10.jsonl.gz"}, {"config_name": "2007-11", "data_files": "files/2007-11.jsonl.gz"}, {"config_name": "2007-12", "data_files": "files/2007-12.jsonl.gz"}, {"config_name": "2008-01", "data_files": "files/2008-01.jsonl.gz"}, {"config_name": "2008-02", "data_files": "files/2008-02.jsonl.gz"}, {"config_name": "2008-03", "data_files": "files/2008-03.jsonl.gz"}, {"config_name": "2008-04", "data_files": "files/2008-04.jsonl.gz"}, {"config_name": "2008-05", "data_files": "files/2008-05.jsonl.gz"}, {"config_name": "2008-06", "data_files": "files/2008-06.jsonl.gz"}, {"config_name": "2008-07", "data_files": "files/2008-07.jsonl.gz"}, {"config_name": "2008-08", "data_files": "files/2008-08.jsonl.gz"}, {"config_name": "2008-09", "data_files": "files/2008-09.jsonl.gz"}, {"config_name": "2008-10", "data_files": "files/2008-10.jsonl.gz"}, {"config_name": "2008-11", "data_files": "files/2008-11.jsonl.gz"}, {"config_name": "2008-12", "data_files": "files/2008-12.jsonl.gz"}, {"config_name": "2009-01", "data_files": "files/2009-01.jsonl.gz"}, {"config_name": "2009-02", "data_files": 
"files/2009-02.jsonl.gz"}, {"config_name": "2009-03", "data_files": "files/2009-03.jsonl.gz"}, {"config_name": "2009-04", "data_files": "files/2009-04.jsonl.gz"}, {"config_name": "2009-05", "data_files": "files/2009-05.jsonl.gz"}, {"config_name": "2009-06", "data_files": "files/2009-06.jsonl.gz"}, {"config_name": "2009-07", "data_files": "files/2009-07.jsonl.gz"}, {"config_name": "2009-08", "data_files": "files/2009-08.jsonl.gz"}, {"config_name": "2009-09", "data_files": "files/2009-09.jsonl.gz"}, {"config_name": "2009-10", "data_files": "files/2009-10.jsonl.gz"}, {"config_name": "2009-11", "data_files": "files/2009-11.jsonl.gz"}, {"config_name": "2009-12", "data_files": "files/2009-12.jsonl.gz"}, {"config_name": "2010-01", "data_files": "files/2010-01.jsonl.gz"}, {"config_name": "2010-02", "data_files": "files/2010-02.jsonl.gz"}, {"config_name": "2010-03", "data_files": "files/2010-03.jsonl.gz"}, {"config_name": "2010-04", "data_files": "files/2010-04.jsonl.gz"}, {"config_name": "2010-05", "data_files": "files/2010-05.jsonl.gz"}, {"config_name": "2010-06", "data_files": "files/2010-06.jsonl.gz"}, {"config_name": "2010-07", "data_files": "files/2010-07.jsonl.gz"}, {"config_name": "2010-08", "data_files": "files/2010-08.jsonl.gz"}, {"config_name": "2010-09", "data_files": "files/2010-09.jsonl.gz"}, {"config_name": "2010-10", "data_files": "files/2010-10.jsonl.gz"}, {"config_name": "2010-11", "data_files": "files/2010-11.jsonl.gz"}, {"config_name": "2010-12", "data_files": "files/2010-12.jsonl.gz"}, {"config_name": "2011-01", "data_files": "files/2011-01.jsonl.gz"}, {"config_name": "2011-02", "data_files": "files/2011-02.jsonl.gz"}, {"config_name": "2011-03", "data_files": "files/2011-03.jsonl.gz"}, {"config_name": "2011-04", "data_files": "files/2011-04.jsonl.gz"}, {"config_name": "2011-05", "data_files": "files/2011-05.jsonl.gz"}, {"config_name": "2011-06", "data_files": "files/2011-06.jsonl.gz"}, {"config_name": "2011-07", "data_files": "files/2011-07.jsonl.gz"}, {"config_name": "2011-08", "data_files": "files/2011-08.jsonl.gz"}, {"config_name": "2011-09", "data_files": "files/2011-09.jsonl.gz"}, {"config_name": "2011-10", "data_files": "files/2011-10.jsonl.gz"}, {"config_name": "2011-11", "data_files": "files/2011-11.jsonl.gz"}, {"config_name": "2011-12", "data_files": "files/2011-12.jsonl.gz"}, {"config_name": "2012-01", "data_files": "files/2012-01.jsonl.gz"}, {"config_name": "2012-02", "data_files": "files/2012-02.jsonl.gz"}, {"config_name": "2012-03", "data_files": "files/2012-03.jsonl.gz"}, {"config_name": "2012-04", "data_files": "files/2012-04.jsonl.gz"}, {"config_name": "2012-05", "data_files": "files/2012-05.jsonl.gz"}, {"config_name": "2012-06", "data_files": "files/2012-06.jsonl.gz"}, {"config_name": "2012-07", "data_files": "files/2012-07.jsonl.gz"}, {"config_name": "2012-08", "data_files": "files/2012-08.jsonl.gz"}, {"config_name": "2012-09", "data_files": "files/2012-09.jsonl.gz"}, {"config_name": "2012-10", "data_files": "files/2012-10.jsonl.gz"}, {"config_name": "2012-11", "data_files": "files/2012-11.jsonl.gz"}, {"config_name": "2012-12", "data_files": "files/2012-12.jsonl.gz"}, {"config_name": "2013-01", "data_files": "files/2013-01.jsonl.gz"}, {"config_name": "2013-02", "data_files": "files/2013-02.jsonl.gz"}, {"config_name": "2013-03", "data_files": "files/2013-03.jsonl.gz"}, {"config_name": "2013-04", "data_files": "files/2013-04.jsonl.gz"}, {"config_name": "2013-05", "data_files": "files/2013-05.jsonl.gz"}, {"config_name": "2013-06", "data_files": 
"files/2013-06.jsonl.gz"}, {"config_name": "2013-07", "data_files": "files/2013-07.jsonl.gz"}, {"config_name": "2013-08", "data_files": "files/2013-08.jsonl.gz"}, {"config_name": "2013-09", "data_files": "files/2013-09.jsonl.gz"}, {"config_name": "2013-10", "data_files": "files/2013-10.jsonl.gz"}, {"config_name": "2013-11", "data_files": "files/2013-11.jsonl.gz"}, {"config_name": "2013-12", "data_files": "files/2013-12.jsonl.gz"}, {"config_name": "2014-01", "data_files": "files/2014-01.jsonl.gz"}, {"config_name": "2014-02", "data_files": "files/2014-02.jsonl.gz"}, {"config_name": "2014-03", "data_files": "files/2014-03.jsonl.gz"}, {"config_name": "2014-04", "data_files": "files/2014-04.jsonl.gz"}, {"config_name": "2014-05", "data_files": "files/2014-05.jsonl.gz"}, {"config_name": "2014-06", "data_files": "files/2014-06.jsonl.gz"}, {"config_name": "2014-07", "data_files": "files/2014-07.jsonl.gz"}, {"config_name": "2014-08", "data_files": "files/2014-08.jsonl.gz"}, {"config_name": "2014-09", "data_files": "files/2014-09.jsonl.gz"}, {"config_name": "2014-10", "data_files": "files/2014-10.jsonl.gz"}, {"config_name": "2014-11", "data_files": "files/2014-11.jsonl.gz"}, {"config_name": "2014-12", "data_files": "files/2014-12.jsonl.gz"}, {"config_name": "2015-01", "data_files": "files/2015-01.jsonl.gz"}, {"config_name": "2015-02", "data_files": "files/2015-02.jsonl.gz"}, {"config_name": "2015-03", "data_files": "files/2015-03.jsonl.gz"}, {"config_name": "2015-04", "data_files": "files/2015-04.jsonl.gz"}, {"config_name": "2015-05", "data_files": "files/2015-05.jsonl.gz"}, {"config_name": "2015-06", "data_files": "files/2015-06.jsonl.gz"}, {"config_name": "2015-07", "data_files": "files/2015-07.jsonl.gz"}, {"config_name": "2015-08", "data_files": "files/2015-08.jsonl.gz"}, {"config_name": "2015-09", "data_files": "files/2015-09.jsonl.gz"}, {"config_name": "2015-10", "data_files": "files/2015-10.jsonl.gz"}, {"config_name": "2015-11", "data_files": "files/2015-11.jsonl.gz"}, {"config_name": "2015-12", "data_files": "files/2015-12.jsonl.gz"}, {"config_name": "2016-01", "data_files": "files/2016-01.jsonl.gz"}, {"config_name": "2016-02", "data_files": "files/2016-02.jsonl.gz"}, {"config_name": "2016-03", "data_files": "files/2016-03.jsonl.gz"}, {"config_name": "2016-04", "data_files": "files/2016-04.jsonl.gz"}, {"config_name": "2016-05", "data_files": "files/2016-05.jsonl.gz"}, {"config_name": "2016-06", "data_files": "files/2016-06.jsonl.gz"}, {"config_name": "2016-07", "data_files": "files/2016-07.jsonl.gz"}, {"config_name": "2016-08", "data_files": "files/2016-08.jsonl.gz"}, {"config_name": "2016-09", "data_files": "files/2016-09.jsonl.gz"}, {"config_name": "2016-10", "data_files": "files/2016-10.jsonl.gz"}, {"config_name": "2016-11", "data_files": "files/2016-11.jsonl.gz"}, {"config_name": "2016-12", "data_files": "files/2016-12.jsonl.gz"}, {"config_name": "2017-01", "data_files": "files/2017-01.jsonl.gz"}, {"config_name": "2017-02", "data_files": "files/2017-02.jsonl.gz"}, {"config_name": "2017-03", "data_files": "files/2017-03.jsonl.gz"}, {"config_name": "2017-04", "data_files": "files/2017-04.jsonl.gz"}, {"config_name": "2017-05", "data_files": "files/2017-05.jsonl.gz"}, {"config_name": "2017-06", "data_files": "files/2017-06.jsonl.gz"}, {"config_name": "2017-07", "data_files": "files/2017-07.jsonl.gz"}, {"config_name": "2017-08", "data_files": "files/2017-08.jsonl.gz"}, {"config_name": "2017-09", "data_files": "files/2017-09.jsonl.gz"}, {"config_name": "2017-10", "data_files": 
"files/2017-10.jsonl.gz"}, {"config_name": "2017-11", "data_files": "files/2017-11.jsonl.gz"}, {"config_name": "2017-12", "data_files": "files/2017-12.jsonl.gz"}, {"config_name": "2018-01", "data_files": "files/2018-01.jsonl.gz"}, {"config_name": "2018-02", "data_files": "files/2018-02.jsonl.gz"}, {"config_name": "2018-03", "data_files": "files/2018-03.jsonl.gz"}, {"config_name": "2018-04", "data_files": "files/2018-04.jsonl.gz"}, {"config_name": "2018-05", "data_files": "files/2018-05.jsonl.gz"}, {"config_name": "2018-06", "data_files": "files/2018-06.jsonl.gz"}, {"config_name": "2018-07", "data_files": "files/2018-07.jsonl.gz"}, {"config_name": "2018-08", "data_files": "files/2018-08.jsonl.gz"}, {"config_name": "2018-09", "data_files": "files/2018-09.jsonl.gz"}, {"config_name": "2018-10", "data_files": "files/2018-10.jsonl.gz"}, {"config_name": "2018-11", "data_files": "files/2018-11.jsonl.gz"}, {"config_name": "2018-12", "data_files": "files/2018-12.jsonl.gz"}, {"config_name": "2019-01", "data_files": "files/2019-01.jsonl.gz"}, {"config_name": "2019-02", "data_files": "files/2019-02.jsonl.gz"}, {"config_name": "2019-03", "data_files": "files/2019-03.jsonl.gz"}, {"config_name": "2019-04", "data_files": "files/2019-04.jsonl.gz"}, {"config_name": "2019-05", "data_files": "files/2019-05.jsonl.gz"}, {"config_name": "2019-06", "data_files": "files/2019-06.jsonl.gz"}, {"config_name": "2019-07", "data_files": "files/2019-07.jsonl.gz"}, {"config_name": "2019-08", "data_files": "files/2019-08.jsonl.gz"}, {"config_name": "2019-09", "data_files": "files/2019-09.jsonl.gz"}, {"config_name": "2019-10", "data_files": "files/2019-10.jsonl.gz"}, {"config_name": "2019-11", "data_files": "files/2019-11.jsonl.gz"}, {"config_name": "2019-12", "data_files": "files/2019-12.jsonl.gz"}, {"config_name": "2020-01", "data_files": "files/2020-01.jsonl.gz"}, {"config_name": "2020-02", "data_files": "files/2020-02.jsonl.gz"}, {"config_name": "2020-03", "data_files": "files/2020-03.jsonl.gz"}, {"config_name": "2020-04", "data_files": "files/2020-04.jsonl.gz"}, {"config_name": "2020-05", "data_files": "files/2020-05.jsonl.gz"}, {"config_name": "2020-06", "data_files": "files/2020-06.jsonl.gz"}, {"config_name": "2020-07", "data_files": "files/2020-07.jsonl.gz"}, {"config_name": "2020-08", "data_files": "files/2020-08.jsonl.gz"}, {"config_name": "2020-09", "data_files": "files/2020-09.jsonl.gz"}, {"config_name": "2020-10", "data_files": "files/2020-10.jsonl.gz"}, {"config_name": "2020-11", "data_files": "files/2020-11.jsonl.gz"}, {"config_name": "2020-12", "data_files": "files/2020-12.jsonl.gz"}, {"config_name": "2021-01", "data_files": "files/2021-01.jsonl.gz"}, {"config_name": "2021-02", "data_files": "files/2021-02.jsonl.gz"}, {"config_name": "2021-03", "data_files": "files/2021-03.jsonl.gz"}, {"config_name": "2021-04", "data_files": "files/2021-04.jsonl.gz"}, {"config_name": "2021-05", "data_files": "files/2021-05.jsonl.gz"}, {"config_name": "2021-06", "data_files": "files/2021-06.jsonl.gz"}, {"config_name": "2021-07", "data_files": "files/2021-07.jsonl.gz"}, {"config_name": "2021-08", "data_files": "files/2021-08.jsonl.gz"}, {"config_name": "2021-09", "data_files": "files/2021-09.jsonl.gz"}, {"config_name": "2021-10", "data_files": "files/2021-10.jsonl.gz"}, {"config_name": "2021-11", "data_files": "files/2021-11.jsonl.gz"}, {"config_name": "2021-12", "data_files": "files/2021-12.jsonl.gz"}, {"config_name": "2022-01", "data_files": "files/2022-01.jsonl.gz"}, {"config_name": "2022-02", "data_files": 
"files/2022-02.jsonl.gz"}, {"config_name": "2022-03", "data_files": "files/2022-03.jsonl.gz"}, {"config_name": "2022-04", "data_files": "files/2022-04.jsonl.gz"}, {"config_name": "2022-05", "data_files": "files/2022-05.jsonl.gz"}, {"config_name": "2022-06", "data_files": "files/2022-06.jsonl.gz"}, {"config_name": "2022-07", "data_files": "files/2022-07.jsonl.gz"}, {"config_name": "2022-08", "data_files": "files/2022-08.jsonl.gz"}, {"config_name": "2022-09", "data_files": "files/2022-09.jsonl.gz"}, {"config_name": "2022-10", "data_files": "files/2022-10.jsonl.gz"}, {"config_name": "2022-11", "data_files": "files/2022-11.jsonl.gz"}, {"config_name": "2022-12", "data_files": "files/2022-12.jsonl.gz"}, {"config_name": "2023-01", "data_files": "files/2023-01.jsonl.gz"}, {"config_name": "2023-02", "data_files": "files/2023-02.jsonl.gz"}, {"config_name": "2023-03", "data_files": "files/2023-03.jsonl.gz"}, {"config_name": "2023-04", "data_files": "files/2023-04.jsonl.gz"}, {"config_name": "2023-05", "data_files": "files/2023-05.jsonl.gz"}, {"config_name": "2023-06", "data_files": "files/2023-06.jsonl.gz"}, {"config_name": "2023-07", "data_files": "files/2023-07.jsonl.gz"}, {"config_name": "2023-08", "data_files": "files/2023-08.jsonl.gz"}, {"config_name": "2023-09", "data_files": "files/2023-09.jsonl.gz"}, {"config_name": "2023-10", "data_files": "files/2023-10.jsonl.gz"}, {"config_name": "2023-11", "data_files": "files/2023-11.jsonl.gz"}, {"config_name": "2023-12", "data_files": "files/2023-12.jsonl.gz"}]}
|
2024-01-11T15:53:11+00:00
|
[] |
[] |
TAGS
#license-eupl-1.1 #region-us
|
# 🇪🇺 EuroVoc dataset
This dataset contains more than 3,700,000 documents in 39 languages with associated EuroVoc labels.
## What's Cellar ?
Cellar is the common data repository of the Publications Office of the European Union. Digital publications and metadata are stored in and disseminated via Cellar, in order to be used by humans and machines. Aiming to transparently serve users, Cellar stores multilingual publications and metadata; it is open to all EU citizens and provides machine-readable data.
URL
## Why was this dataset created ?
"Extreme classification come with challenges of scalability due to large label spaces, data sparsity issues due to insufficient training samples."
URL
## How was this dataset created ?
The source code is available, check 'URL'
## When this dataset was created ?
14 July 2023
## What are the main characteristics of this dataset ?
There are a total of 39 different languages present in this dataset, of which some are EU languages and some are not. As the following graph illustrates, most of the documents of the dataset are written in EU languages (English being the most present language in the dataset), and the non-EU languages are very poorly represented (for example Arabic, Japanese,...). Note that since the Irish language ('gle') was granted full official and working status in the EU in 2022, there are very few documents in that language. Additionally, Croatian ('hrv') is also less represented in the dataset as Croatia is the latest country to have joined the EU in 2013.
!language graph
The lengths of the documents also vary depending on the language they are written in. The document lengths are quite variable, especially in English. There is therefore a quite large disparity in document lengths in this dataset. Note that this boxplot does not present the outliers, since certain documents can contain up to 86 million characters. The red lines in the boxplot indicate the median length of the documents for each language.
!boxplot
We notice that the documents in Irish have a very wide variability in document lengths, because there are very few documents in that language. Therefore, we present the same boxplot without the Irish language in order to visualize the document length distribution in the other languages in more detail.
!boxplot
## How is the data structured ?
An example of a sample of this dataset is the following :
- 'title' : title of the document
- 'date' : publication date of the document
- 'eurovoc_concepts' : list of the EuroVoc concepts related to this document
- 'url' : URL to access the document
- 'formats' : list of formats in which the original document is available
- 'text' : text content of the document
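For instance, a minimal loading sketch could look like the following. Note the assumptions: the repository id is a placeholder to substitute with this dataset's actual Hub id, and a 'train' split is assumed; streaming is used so the 3,700,000+ documents are not downloaded up front.
```python
from datasets import load_dataset

# "<this-dataset-repo-id>" is a placeholder -- substitute the actual Hub id.
# streaming=True avoids materializing the whole corpus locally.
ds = load_dataset("<this-dataset-repo-id>", split="train", streaming=True)

sample = next(iter(ds))
print(sample["title"])             # title of the document
print(sample["date"])              # publication date
print(sample["eurovoc_concepts"])  # list of associated EuroVoc concepts
print(sample["url"])               # URL to access the document
print(sample["formats"])           # formats of the original document
print(len(sample["text"]))         # size of the text content
```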
## Bibliography
- Ilias Chalkidis, Emmanouil Fergadiotis, Prodromos Malakasiotis, Nikolaos Aletras, and Ion Androutsopoulos. 2019. Extreme Multi-Label Legal Text Classification: A Case Study in EU Legislation. In Proceedings of the Natural Legal Language Processing Workshop 2019, pages 78–87, Minneapolis, Minnesota. Association for Computational Linguistics.
- I. Chalkidis, M. Fergadiotis, P. Malakasiotis and I. Androutsopoulos, Large-Scale Multi-Label Text Classification on EU Legislation. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), Florence, Italy, (short papers), 2019.
- Andrei-Marius Avram, Vasile Pais, and Dan Ioan Tufis. 2021. PyEuroVoc: A Tool for Multilingual Legal Document Classification with EuroVoc Descriptors. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021), pages 92–101, Held Online. INCOMA Ltd..
- SHAHEEN, Zein, WOHLGENANNT, Gerhard, et FILTZ, Erwin. Large scale legal text classification using transformer models. arXiv preprint arXiv:2010.12871, 2020.
## Author(s)
Sébastien Campion <sebastien.campion@URL>
|
[
"# 🇪🇺 ️ EuroVoc dataset\n\nThis dataset contains more that 3,700,000 documents in 39 languages with associated EuroVoc labels.",
"## What's Cellar ? \n\nCellar is the common data repository of the Publications Office of the European Union. Digital publications and metadata are stored in and disseminated via Cellar, in order to be used by humans and machines. Aiming to transparently serve users, Cellar stores multilingual publications and metadata, it is open to all EU citizens and provides machine-readable data.\n\nURL",
"## Why was this dataset created ?\n\n\"Extreme classification come with challenges of scalability due to large label spaces, data sparsity issues due to insufficient training samples.\"\n\nURL",
"## How was dataset this created ? \n\nThe source code is available, check 'URL'",
"## When this dataset was created ? \n\n14 July 2023",
"## What are the main characteristics of this dataset ?\n\nThere are a total of 39 different languages present in this dataset, of which some are EU languages and some are not. As the following graph illustrates, most of the documents of the dataset are written in EU languages (English being the most present language in the dataset), and the non-EU languages are very poorly represented (for example Arabic, Japanese,...). Note that since the Irish language ('gle') was granted full official and working status in the EU in 2022, there are very few documents in that language. Additionally, Croatian ('hrv') is also less represented in the dataset as Croatia is the latest country to have joined the EU in 2013.\n\n!language graph\n\nThe lengths of the documents also varies depending on the language it is written in. The document lengths are quite variable, especially in English. There is therefore a quite large disparity in document lengths in this dataset. Note that this boxplot does not present the outliers, since the length of certain documents can contain up to 86 million characters. The red lines in the boxplot indicates the median length of the documents for each language.\n\n!boxplot\n\nWe notice that the documents in Irish have a very wide variability in document lengths, due to the fact it has very few documents. Therefore, we present the same boxplot without the Irish language in order to visualize with more detail the document length distribution in the other languages.\n\n!boxplot",
"## How is the data structured ?\n\nAn example of a sample of this dataset is the following :\n\n- 'title' : title of the document\n- 'date' : publication date of the document\n- 'eurovoc_concepts' : list of the EuroVoc concepts related to this document\n- 'url' : URL to access the document\n- 'formats' : list of formats in which the original document is available\n- 'text' : text content of the document",
"## Bibliography\n\n- Ilias Chalkidis, Emmanouil Fergadiotis, Prodromos Malakasiotis, Nikolaos Aletras, and Ion Androutsopoulos. 2019. Extreme Multi-Label Legal Text Classification: A Case Study in EU Legislation. In Proceedings of the Natural Legal Language Processing Workshop 2019, pages 78–87, Minneapolis, Minnesota. Association for Computational Linguistics.\n- I. Chalkidis, M. Fergadiotis, P. Malakasiotis and I. Androutsopoulos, Large-Scale Multi-Label Text Classification on EU Legislation. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), Florence, Italy, (short papers), 2019.\n- Andrei-Marius Avram, Vasile Pais, and Dan Ioan Tufis. 2021. PyEuroVoc: A Tool for Multilingual Legal Document Classification with EuroVoc Descriptors. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021), pages 92–101, Held Online. INCOMA Ltd..\n- SHAHEEN, Zein, WOHLGENANNT, Gerhard, et FILTZ, Erwin. Large scale legal text classification using transformer models. arXiv preprint arXiv:2010.12871, 2020.",
"## Author(s)\n\nSébastien Campion <sebastien.campion@URL>"
] |
[
"TAGS\n#license-eupl-1.1 #region-us \n",
"# 🇪🇺 ️ EuroVoc dataset\n\nThis dataset contains more that 3,700,000 documents in 39 languages with associated EuroVoc labels.",
"## What's Cellar ? \n\nCellar is the common data repository of the Publications Office of the European Union. Digital publications and metadata are stored in and disseminated via Cellar, in order to be used by humans and machines. Aiming to transparently serve users, Cellar stores multilingual publications and metadata, it is open to all EU citizens and provides machine-readable data.\n\nURL",
"## Why was this dataset created ?\n\n\"Extreme classification come with challenges of scalability due to large label spaces, data sparsity issues due to insufficient training samples.\"\n\nURL",
"## How was dataset this created ? \n\nThe source code is available, check 'URL'",
"## When this dataset was created ? \n\n14 July 2023",
"## What are the main characteristics of this dataset ?\n\nThere are a total of 39 different languages present in this dataset, of which some are EU languages and some are not. As the following graph illustrates, most of the documents of the dataset are written in EU languages (English being the most present language in the dataset), and the non-EU languages are very poorly represented (for example Arabic, Japanese,...). Note that since the Irish language ('gle') was granted full official and working status in the EU in 2022, there are very few documents in that language. Additionally, Croatian ('hrv') is also less represented in the dataset as Croatia is the latest country to have joined the EU in 2013.\n\n!language graph\n\nThe lengths of the documents also varies depending on the language it is written in. The document lengths are quite variable, especially in English. There is therefore a quite large disparity in document lengths in this dataset. Note that this boxplot does not present the outliers, since the length of certain documents can contain up to 86 million characters. The red lines in the boxplot indicates the median length of the documents for each language.\n\n!boxplot\n\nWe notice that the documents in Irish have a very wide variability in document lengths, due to the fact it has very few documents. Therefore, we present the same boxplot without the Irish language in order to visualize with more detail the document length distribution in the other languages.\n\n!boxplot",
"## How is the data structured ?\n\nAn example of a sample of this dataset is the following :\n\n- 'title' : title of the document\n- 'date' : publication date of the document\n- 'eurovoc_concepts' : list of the EuroVoc concepts related to this document\n- 'url' : URL to access the document\n- 'formats' : list of formats in which the original document is available\n- 'text' : text content of the document",
"## Bibliography\n\n- Ilias Chalkidis, Emmanouil Fergadiotis, Prodromos Malakasiotis, Nikolaos Aletras, and Ion Androutsopoulos. 2019. Extreme Multi-Label Legal Text Classification: A Case Study in EU Legislation. In Proceedings of the Natural Legal Language Processing Workshop 2019, pages 78–87, Minneapolis, Minnesota. Association for Computational Linguistics.\n- I. Chalkidis, M. Fergadiotis, P. Malakasiotis and I. Androutsopoulos, Large-Scale Multi-Label Text Classification on EU Legislation. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), Florence, Italy, (short papers), 2019.\n- Andrei-Marius Avram, Vasile Pais, and Dan Ioan Tufis. 2021. PyEuroVoc: A Tool for Multilingual Legal Document Classification with EuroVoc Descriptors. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021), pages 92–101, Held Online. INCOMA Ltd..\n- SHAHEEN, Zein, WOHLGENANNT, Gerhard, et FILTZ, Erwin. Large scale legal text classification using transformer models. arXiv preprint arXiv:2010.12871, 2020.",
"## Author(s)\n\nSébastien Campion <sebastien.campion@URL>"
] |
[
14,
36,
92,
41,
18,
11,
328,
99,
303,
20
] |
[
"passage: TAGS\n#license-eupl-1.1 #region-us \n# 🇪🇺 ️ EuroVoc dataset\n\nThis dataset contains more that 3,700,000 documents in 39 languages with associated EuroVoc labels.## What's Cellar ? \n\nCellar is the common data repository of the Publications Office of the European Union. Digital publications and metadata are stored in and disseminated via Cellar, in order to be used by humans and machines. Aiming to transparently serve users, Cellar stores multilingual publications and metadata, it is open to all EU citizens and provides machine-readable data.\n\nURL## Why was this dataset created ?\n\n\"Extreme classification come with challenges of scalability due to large label spaces, data sparsity issues due to insufficient training samples.\"\n\nURL## How was dataset this created ? \n\nThe source code is available, check 'URL'## When this dataset was created ? \n\n14 July 2023"
] |
e30acfc72794cb1570a16cc15b6523e1b142825d
|
# Dataset Card for "giant-midi-sustain-quantized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
JasiekKaczmarczyk/giant-midi-sustain-quantized
|
[
"region:us"
] |
2023-09-01T07:00:40+00:00
|
{"dataset_info": {"features": [{"name": "midi_filename", "dtype": "string"}, {"name": "pitch", "sequence": "int16", "length": 128}, {"name": "dstart", "sequence": "float32", "length": 128}, {"name": "duration", "sequence": "float32", "length": 128}, {"name": "velocity", "sequence": "int16", "length": 128}, {"name": "dstart_bin", "sequence": "int8", "length": 128}, {"name": "duration_bin", "sequence": "int8", "length": 128}, {"name": "velocity_bin", "sequence": "int8", "length": 128}], "splits": [{"name": "train", "num_bytes": 473899450, "num_examples": 238919}, {"name": "validation", "num_bytes": 58421208, "num_examples": 29453}, {"name": "test", "num_bytes": 56581945, "num_examples": 28531}], "download_size": 0, "dataset_size": 588902603}}
|
2023-09-15T09:33:37+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "giant-midi-sustain-quantized"
More Information needed
|
[
"# Dataset Card for \"giant-midi-sustain-quantized\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"giant-midi-sustain-quantized\"\n\nMore Information needed"
] |
[
6,
21
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"giant-midi-sustain-quantized\"\n\nMore Information needed"
] |
081bd81a42eaf65e40fcb21b02fbe3fab102c757
|
# Dataset Card for "alt_potterverse"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
mickume/alt_potterverse
|
[
"region:us"
] |
2023-09-01T07:15:27+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 633374028, "num_examples": 3509338}], "download_size": 392101893, "dataset_size": 633374028}}
|
2023-11-03T06:34:35+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "alt_potterverse"
More Information needed
|
[
"# Dataset Card for \"alt_potterverse\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"alt_potterverse\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"alt_potterverse\"\n\nMore Information needed"
] |
b8e9244027b25c9be13b08340448f786e0297446
|
# Dataset Card for "maestro-sustain-quantized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
JasiekKaczmarczyk/maestro-sustain-quantized
|
[
"region:us"
] |
2023-09-01T07:29:47+00:00
|
{"dataset_info": {"features": [{"name": "midi_filename", "dtype": "string"}, {"name": "pitch", "sequence": "int16", "length": 128}, {"name": "dstart", "sequence": "float32", "length": 128}, {"name": "duration", "sequence": "float32", "length": 128}, {"name": "velocity", "sequence": "int16", "length": 128}, {"name": "dstart_bin", "sequence": "int8", "length": 128}, {"name": "duration_bin", "sequence": "int8", "length": 128}, {"name": "velocity_bin", "sequence": "int8", "length": 128}], "splits": [{"name": "train", "num_bytes": 89689142, "num_examples": 43727}, {"name": "validation", "num_bytes": 10114654, "num_examples": 4929}, {"name": "test", "num_bytes": 11695068, "num_examples": 5695}], "download_size": 0, "dataset_size": 111498864}}
|
2023-09-15T09:26:58+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "maestro-sustain-quantized"
More Information needed
|
[
"# Dataset Card for \"maestro-sustain-quantized\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"maestro-sustain-quantized\"\n\nMore Information needed"
] |
[
6,
19
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"maestro-sustain-quantized\"\n\nMore Information needed"
] |
0015703bfc0dfc34e2914694b8d8a5e55d242a42
|
# Dataset Card for "qg-tagging-normalized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
jxie/qg-tagging-normalized
|
[
"region:us"
] |
2023-09-01T07:30:36+00:00
|
{"dataset_info": {"features": [{"name": "inputs", "sequence": {"sequence": "float64"}}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 6944726400, "num_examples": 1600000}, {"name": "val", "num_bytes": 868957000, "num_examples": 200000}, {"name": "test", "num_bytes": 868286700, "num_examples": 200000}], "download_size": 3812296127, "dataset_size": 8681970100}}
|
2023-09-01T07:38:06+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "qg-tagging-normalized"
More Information needed
|
[
"# Dataset Card for \"qg-tagging-normalized\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"qg-tagging-normalized\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"qg-tagging-normalized\"\n\nMore Information needed"
] |
60205fdfbd1d3df54474d23c2d9064c23685cf63
|
# Dataset Card for "trichomes_moment_lens_instance_segmentation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
etaylor/trichomes_moment_lens_instance_segmentation
|
[
"region:us"
] |
2023-09-01T07:33:02+00:00
|
{"dataset_info": {"features": [{"name": "pixel_values", "dtype": "image"}, {"name": "label", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 52395517.0, "num_examples": 51}], "download_size": 4014954, "dataset_size": 52395517.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-01T07:33:09+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "trichomes_moment_lens_instance_segmentation"
More Information needed
|
[
"# Dataset Card for \"trichomes_moment_lens_instance_segmentation\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"trichomes_moment_lens_instance_segmentation\"\n\nMore Information needed"
] |
[
6,
25
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"trichomes_moment_lens_instance_segmentation\"\n\nMore Information needed"
] |
c0c92a2ac55951030c7eb4e0fc068224535c1b46
|
# Dataset Card for Evaluation run of TheBloke/Platypus2-70B-Instruct-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Platypus2-70B-Instruct-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Platypus2-70B-Instruct-GPTQ](https://huggingface.co/TheBloke/Platypus2-70B-Instruct-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Platypus2-70B-Instruct-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
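As a further sketch (assuming the standard `datasets` API; the config name below is one of those listed in this card's metadata), you can enumerate the available per-task configurations and inspect one task's details:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_TheBloke__Platypus2-70B-Instruct-GPTQ"

# Enumerate every per-task configuration available for this run.
configs = get_dataset_config_names(repo)
print(len(configs), "configurations")

# Load one MMLU subtask; the "latest" split points to the most recent run.
details = load_dataset(repo, "harness_hendrycksTest_abstract_algebra_5", split="latest")
print(details[0])
```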
## Latest results
These are the [latest results from run 2023-09-01T08:39:03.285201](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Platypus2-70B-Instruct-GPTQ/blob/main/results_2023-09-01T08%3A39%3A03.285201.json)(note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6985296232204664,
"acc_stderr": 0.03125037426870383,
"acc_norm": 0.7020835749710057,
"acc_norm_stderr": 0.031223245232596956,
"mc1": 0.4455324357405141,
"mc1_stderr": 0.017399335280140354,
"mc2": 0.6253657801165746,
"mc2_stderr": 0.01474854589221215
},
"harness|arc:challenge|25": {
"acc": 0.6919795221843004,
"acc_stderr": 0.013491429517292038,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266129
},
"harness|hellaswag|10": {
"acc": 0.6863174666401115,
"acc_stderr": 0.004630407476835178,
"acc_norm": 0.8755228042222665,
"acc_norm_stderr": 0.003294504807555233
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7471698113207547,
"acc_stderr": 0.026749899771241214,
"acc_norm": 0.7471698113207547,
"acc_norm_stderr": 0.026749899771241214
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948614,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948614
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.03078373675774565,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.03078373675774565
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4656084656084656,
"acc_stderr": 0.02569032176249384,
"acc_norm": 0.4656084656084656,
"acc_norm_stderr": 0.02569032176249384
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5467980295566502,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.5467980295566502,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.025485498373343237,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.025485498373343237
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.02482590979334334,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.02482590979334334
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465942,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465942
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114982,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114982
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7815126050420168,
"acc_stderr": 0.02684151432295894,
"acc_norm": 0.7815126050420168,
"acc_norm_stderr": 0.02684151432295894
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.908256880733945,
"acc_stderr": 0.012376323409137116,
"acc_norm": 0.908256880733945,
"acc_norm_stderr": 0.012376323409137116
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997866,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997866
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969427,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065494,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.026936111912802277,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.026936111912802277
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752596,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752596
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035216,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035216
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.02963471727237103,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.02963471727237103
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8659003831417624,
"acc_stderr": 0.012185528166499978,
"acc_norm": 0.8659003831417624,
"acc_norm_stderr": 0.012185528166499978
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7687861271676301,
"acc_stderr": 0.02269865716785571,
"acc_norm": 0.7687861271676301,
"acc_norm_stderr": 0.02269865716785571
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.646927374301676,
"acc_stderr": 0.01598420454526858,
"acc_norm": 0.646927374301676,
"acc_norm_stderr": 0.01598420454526858
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.024288619466046105,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.024288619466046105
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.02103851777015737,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.02103851777015737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5673758865248227,
"acc_stderr": 0.029555454236778852,
"acc_norm": 0.5673758865248227,
"acc_norm_stderr": 0.029555454236778852
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5860495436766623,
"acc_stderr": 0.012579699631289262,
"acc_norm": 0.5860495436766623,
"acc_norm_stderr": 0.012579699631289262
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7132352941176471,
"acc_stderr": 0.027472274473233818,
"acc_norm": 0.7132352941176471,
"acc_norm_stderr": 0.027472274473233818
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.01736247376214661,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.01736247376214661
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7795918367346939,
"acc_stderr": 0.02653704531214529,
"acc_norm": 0.7795918367346939,
"acc_norm_stderr": 0.02653704531214529
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018526,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018526
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160875,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160875
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4455324357405141,
"mc1_stderr": 0.017399335280140354,
"mc2": 0.6253657801165746,
"mc2_stderr": 0.01474854589221215
}
}
```
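As a small illustrative sketch of how this results structure can be consumed (only a few entries from the table above are reproduced, so the values are real but the dict is truncated), one can rank the MMLU subtasks by normalized accuracy:

```python
# Illustrative only: a subset of the per-task results shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.35},
    "harness|hendrycksTest-high_school_mathematics|5": {"acc_norm": 0.3037037037037037},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.8421052631578947},
}

# Sort subtasks from weakest to strongest by normalized accuracy.
for task, scores in sorted(results.items(), key=lambda kv: kv[1]["acc_norm"]):
    print(f"{scores['acc_norm']:.3f}  {task}")
```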
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__Platypus2-70B-Instruct-GPTQ
|
[
"region:us"
] |
2023-09-01T07:39:28+00:00
|
{"pretty_name": "Evaluation run of TheBloke/Platypus2-70B-Instruct-GPTQ", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/Platypus2-70B-Instruct-GPTQ](https://huggingface.co/TheBloke/Platypus2-70B-Instruct-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Platypus2-70B-Instruct-GPTQ\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-01T08:39:03.285201](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Platypus2-70B-Instruct-GPTQ/blob/main/results_2023-09-01T08%3A39%3A03.285201.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6985296232204664,\n \"acc_stderr\": 0.03125037426870383,\n \"acc_norm\": 0.7020835749710057,\n \"acc_norm_stderr\": 0.031223245232596956,\n \"mc1\": 0.4455324357405141,\n \"mc1_stderr\": 0.017399335280140354,\n \"mc2\": 0.6253657801165746,\n \"mc2_stderr\": 0.01474854589221215\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6919795221843004,\n \"acc_stderr\": 0.013491429517292038,\n \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266129\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6863174666401115,\n \"acc_stderr\": 0.004630407476835178,\n \"acc_norm\": 0.8755228042222665,\n \"acc_norm_stderr\": 0.003294504807555233\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7471698113207547,\n \"acc_stderr\": 0.026749899771241214,\n \"acc_norm\": 0.7471698113207547,\n \"acc_norm_stderr\": 0.026749899771241214\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.031164899666948614,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.031164899666948614\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n 
\"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.03078373675774565,\n \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.03078373675774565\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419036,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419036\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4656084656084656,\n \"acc_stderr\": 0.02569032176249384,\n \"acc_norm\": 0.4656084656084656,\n \"acc_norm_stderr\": 0.02569032176249384\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5467980295566502,\n \"acc_stderr\": 0.03502544650845872,\n \"acc_norm\": 0.5467980295566502,\n \"acc_norm_stderr\": 0.03502544650845872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.025485498373343237,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.025485498373343237\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8585858585858586,\n \"acc_stderr\": 0.02482590979334334,\n \"acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.02482590979334334\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 0.01599622932024412\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465942,\n \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465942\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114982,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114982\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7815126050420168,\n \"acc_stderr\": 0.02684151432295894,\n \"acc_norm\": 0.7815126050420168,\n \"acc_norm_stderr\": 0.02684151432295894\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.908256880733945,\n \"acc_stderr\": 0.012376323409137116,\n \"acc_norm\": 0.908256880733945,\n \"acc_norm_stderr\": 0.012376323409137116\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997866,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997866\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9068627450980392,\n \"acc_stderr\": 0.020397853969427,\n \"acc_norm\": 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969427\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065494,\n \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065494\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.026936111912802277,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.026936111912802277\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752596,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752596\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035216,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035216\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.02963471727237103,\n \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.02963471727237103\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8659003831417624,\n \"acc_stderr\": 0.012185528166499978,\n \"acc_norm\": 0.8659003831417624,\n \"acc_norm_stderr\": 0.012185528166499978\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7687861271676301,\n \"acc_stderr\": 0.02269865716785571,\n \"acc_norm\": 0.7687861271676301,\n \"acc_norm_stderr\": 0.02269865716785571\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.646927374301676,\n \"acc_stderr\": 0.01598420454526858,\n \"acc_norm\": 0.646927374301676,\n \"acc_norm_stderr\": 0.01598420454526858\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.024288619466046105,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.024288619466046105\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.02103851777015737,\n \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.02103851777015737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5673758865248227,\n \"acc_stderr\": 0.029555454236778852,\n \"acc_norm\": 0.5673758865248227,\n \"acc_norm_stderr\": 0.029555454236778852\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5860495436766623,\n \"acc_stderr\": 0.012579699631289262,\n \"acc_norm\": 0.5860495436766623,\n \"acc_norm_stderr\": 0.012579699631289262\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7132352941176471,\n \"acc_stderr\": 0.027472274473233818,\n \"acc_norm\": 0.7132352941176471,\n \"acc_norm_stderr\": 0.027472274473233818\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7565359477124183,\n \"acc_stderr\": 0.01736247376214661,\n \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.01736247376214661\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7795918367346939,\n \"acc_stderr\": 0.02653704531214529,\n \"acc_norm\": 0.7795918367346939,\n \"acc_norm_stderr\": 0.02653704531214529\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160875,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160875\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4455324357405141,\n \"mc1_stderr\": 0.017399335280140354,\n \"mc2\": 0.6253657801165746,\n \"mc2_stderr\": 0.01474854589221215\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/Platypus2-70B-Instruct-GPTQ", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email 
protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|arc:challenge|25_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hellaswag|10_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T08:39:03.285201.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-01T08:39:03.285201.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T08:39:03.285201.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-01T08:39:03.285201.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-01T08:39:03.285201.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_01T08_39_03.285201", "path": ["results_2023-09-01T08:39:03.285201.parquet"]}, {"split": "latest", "path": ["results_2023-09-01T08:39:03.285201.parquet"]}]}]}
|
2023-09-01T07:40:25+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/Platypus2-70B-Instruct-GPTQ
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/Platypus2-70B-Instruct-GPTQ on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
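(A minimal sketch; the repo id below assumes the usual open-llm-leaderboard details naming, while the config and split names come from this dataset's own metadata.)

```python
from datasets import load_dataset

# Repo id is an assumption based on the usual details-repo naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_TheBloke__Platypus2-70B-Instruct-GPTQ",
    "harness_truthfulqa_mc_0",  # one of the 61 configurations
    split="latest",             # or the timestamped split of a specific run
)
print(data[0])
```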
## Latest results
These are the latest results from run 2023-09-01T08:39:03.285201 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/Platypus2-70B-Instruct-GPTQ",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Platypus2-70B-Instruct-GPTQ on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-01T08:39:03.285201(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/Platypus2-70B-Instruct-GPTQ",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Platypus2-70B-Instruct-GPTQ on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-01T08:39:03.285201(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
26,
31,
174,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/Platypus2-70B-Instruct-GPTQ## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Platypus2-70B-Instruct-GPTQ on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-01T08:39:03.285201(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
0b91d4d6ca0a486ce8f212e7592e659817436095
|
This is a 21-class land use image dataset meant for research purposes.
There are 100 images for each of the following classes:
- agricultural
- airplane
- baseballdiamond
- beach
- buildings
- chaparral
- denseresidential
- forest
- freeway
- golfcourse
- harbor
- intersection
- mediumresidential
- mobilehomepark
- overpass
- parkinglot
- river
- runway
- sparseresidential
- storagetanks
- tenniscourt
Each image measures 256x256 pixels.
The images were manually extracted from large images from the USGS National Map Urban Area Imagery collection for various urban areas around the country. The pixel resolution of this public domain imagery is 1 foot.
### Original Dataset Source
For more information about the original UC Merced Land Use dataset, please visit the official dataset page:
[UC Merced Land Use Dataset](http://weegee.vision.ucmerced.edu/datasets/landuse.html)
Please refer to the original dataset source for any additional details, citations, or specific usage guidelines provided by the dataset creators.
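As a quick-start sketch, the dataset can also be loaded with the Hugging Face `datasets` library (the config name `ucmerced_landuse` comes from this repository's metadata):

```python
from datasets import load_dataset

# Loads all 2100 images (21 classes x 100 images each) in the "train" split.
ds = load_dataset("SatwikKambham/uc_merced_land_use", "ucmerced_landuse", split="train")

example = ds[0]
print(example["label"])     # class index, e.g. 0 for "agricultural"
print(example["img"].size)  # PIL image, (256, 256)
```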
|
SatwikKambham/uc_merced_land_use
|
[
"task_categories:image-classification",
"license:cc0-1.0",
"region:us"
] |
2023-09-01T07:45:40+00:00
|
{"license": "cc0-1.0", "task_categories": ["image-classification"], "pretty_name": "UC Merced Land Use", "dataset_info": {"config_name": "ucmerced_landuse", "features": [{"name": "img", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "agricultural", "1": "airplane", "2": "baseballdiamond", "3": "beach", "4": "buildings", "5": "chaparral", "6": "denseresidential", "7": "forest", "8": "freeway", "9": "golfcourse", "10": "harbor", "11": "intersection", "12": "mediumresidential", "13": "mobilehomepark", "14": "overpass", "15": "parkinglot", "16": "river", "17": "runway", "18": "sparseresidential", "19": "storagetanks", "20": "tenniscourt"}}}}], "splits": [{"name": "train", "num_bytes": 406563, "num_examples": 2100}], "download_size": 332468434, "dataset_size": 406563}}
|
2023-09-04T18:12:48+00:00
|
[] |
[] |
TAGS
#task_categories-image-classification #license-cc0-1.0 #region-us
|
This is a 21-class land use image dataset meant for research purposes.
There are 100 images for each of the following classes:
- agricultural
- airplane
- baseballdiamond
- beach
- buildings
- chaparral
- denseresidential
- forest
- freeway
- golfcourse
- harbor
- intersection
- mediumresidential
- mobilehomepark
- overpass
- parkinglot
- river
- runway
- sparseresidential
- storagetanks
- tenniscourt
Each image measures 256x256 pixels.
The images were manually extracted from large images from the USGS National Map Urban Area Imagery collection for various urban areas around the country. The pixel resolution of this public domain imagery is 1 foot.
### Original Dataset Source
For more information about the original UC Merced Land Use dataset, please visit the official dataset page:
UC Merced Land Use Dataset
Please refer to the original dataset source for any additional details, citations, or specific usage guidelines provided by the dataset creators.
|
[
"### Original Dataset Source\n\nFor more information about the original UC Merced Land Use dataset, please visit the official dataset page:\n\nUC Merced Land Use Dataset\n\nPlease refer to the original dataset source for any additional details, citations, or specific usage guidelines provided by the dataset creators."
] |
[
"TAGS\n#task_categories-image-classification #license-cc0-1.0 #region-us \n",
"### Original Dataset Source\n\nFor more information about the original UC Merced Land Use dataset, please visit the official dataset page:\n\nUC Merced Land Use Dataset\n\nPlease refer to the original dataset source for any additional details, citations, or specific usage guidelines provided by the dataset creators."
] |
[
25,
63
] |
[
"passage: TAGS\n#task_categories-image-classification #license-cc0-1.0 #region-us \n### Original Dataset Source\n\nFor more information about the original UC Merced Land Use dataset, please visit the official dataset page:\n\nUC Merced Land Use Dataset\n\nPlease refer to the original dataset source for any additional details, citations, or specific usage guidelines provided by the dataset creators."
] |
3fcdc88a249db6a28aa32d99ff48d3344d71bf23
|
# Dataset Card for "news_articles"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
paoloitaliani/news_articles
|
[
"region:us"
] |
2023-09-01T07:49:48+00:00
|
{"dataset_info": [{"config_name": "corriere_autunno", "features": [{"name": "author", "dtype": "string"}, {"name": "journal", "dtype": "string"}, {"name": "body", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 339578, "num_examples": 90}], "download_size": 237083, "dataset_size": 339578}, {"config_name": "corriere_primavera", "features": [{"name": "author", "dtype": "string"}, {"name": "journal", "dtype": "string"}, {"name": "body", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 319422, "num_examples": 105}], "download_size": 206264, "dataset_size": 319422}, {"config_name": "fattoq_autunno", "features": [{"name": "author", "dtype": "string"}, {"name": "journal", "dtype": "string"}, {"name": "body", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 519012, "num_examples": 133}], "download_size": 338948, "dataset_size": 519012}, {"config_name": "fattoq_primavera", "features": [{"name": "author", "dtype": "string"}, {"name": "journal", "dtype": "string"}, {"name": "body", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 508621, "num_examples": 152}], "download_size": 331977, "dataset_size": 508621}, {"config_name": "ukraine", "features": [{"name": "date", "dtype": "timestamp[ns]"}, {"name": "body", "dtype": "string"}, {"name": "author", "dtype": "string"}, {"name": "journal", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 81923456, "num_examples": 27449}], "download_size": 0, "dataset_size": 81923456}], "configs": [{"config_name": "corriere_autunno", "data_files": [{"split": "train", "path": "corriere_autunno/train-*"}]}, {"config_name": "corriere_primavera", "data_files": [{"split": "train", "path": "corriere_primavera/train-*"}]}, {"config_name": "fattoq_autunno", "data_files": [{"split": "train", "path": "fattoq_autunno/train-*"}]}, {"config_name": "fattoq_primavera", "data_files": [{"split": "train", "path": "fattoq_primavera/train-*"}]}, {"config_name": "ukraine", "data_files": [{"split": "train", "path": "ukraine/train-*"}]}]}
|
2024-01-17T09:15:17+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "news_articles"
More Information needed
|
[
"# Dataset Card for \"news_articles\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"news_articles\"\n\nMore Information needed"
] |
[
6,
14
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"news_articles\"\n\nMore Information needed"
] |
6a5644255c7bb87218758d5ad544fef0fc0d4cd3
|
# !After Detailer
!After Detailer is an extension for the stable diffusion webui, similar to Detection Detailer, except it uses ultralytics instead of mmdet.
## Install
(from Mikubill/sd-webui-controlnet)
1. Open "Extensions" tab.
2. Open "Install from URL" tab in the tab.
3. Enter `https://github.com/Bing-su/adetailer.git` to "URL for extension's git repository".
4. Press "Install" button.
5. Wait 5 seconds, and you will see the message "Installed into stable-diffusion-webui\extensions\adetailer. Use Installed tab to restart".
6. Go to "Installed" tab, click "Check for updates", and then click "Apply and restart UI". (The next time you can also use this method to update extensions.)
7. Completely restart A1111 webui including your terminal. (If you do not know what a "terminal" is, you can reboot your computer: turn your computer off and turn it on again.)
You can now install it directly from the Extensions tab.

You **DON'T** need to download any model from huggingface.
## Options
| Model, Prompts | | |
| --------------------------------- | ------------------------------------- | ------------------------------------------------- |
| ADetailer model | Determine what to detect. | `None` = disable |
| ADetailer prompt, negative prompt | Prompts and negative prompts to apply | If left blank, it will use the same as the input. |
| Detection | | |
| ------------------------------------ | -------------------------------------------------------------------------------------------- | --- |
| Detection model confidence threshold | Only objects with a detection model confidence above this threshold are used for inpainting. | |
| Mask min/max ratio | Only use masks whose area is between those ratios for the area of the entire image. | |
If you want to exclude objects in the background, try setting the min ratio to around `0.01`.
| Mask Preprocessing | | |
| ------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------- |
| Mask x, y offset                | Moves the mask horizontally and vertically by the given offset.                                                                       |                                                                                           |
| Mask erosion (-) / dilation (+) | Enlarge or reduce the detected mask. | [opencv example](https://docs.opencv.org/4.7.0/db/df6/tutorial_erosion_dilatation.html) |
| Mask merge mode | `None`: Inpaint each mask<br/>`Merge`: Merge all masks and inpaint<br/>`Merge and Invert`: Merge all masks and Invert, then inpaint | |
Applied in this order: x, y offset → erosion/dilation → merge/invert.
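As a rough illustration of these steps, here is a sketch with OpenCV in the spirit of the linked tutorial (the kernel size, offsets, and file name are illustrative assumptions, not ADetailer's actual internals):

```python
import cv2
import numpy as np

mask = cv2.imread("mask.png", cv2.IMREAD_GRAYSCALE)  # hypothetical detected mask
h, w = mask.shape

# 1. x, y offset: translate the mask.
M = np.float32([[1, 0, 10], [0, 1, -5]])  # x offset = +10 px, y offset = -5 px
shifted = cv2.warpAffine(mask, M, (w, h))

# 2. erosion (-) / dilation (+): shrink or enlarge the mask.
kernel = np.ones((5, 5), np.uint8)
dilated = cv2.dilate(shifted, kernel, iterations=1)

# 3. merge/invert: combine masks, optionally invert before inpainting.
merged = cv2.bitwise_or(dilated, dilated)  # stand-in for merging several masks
inverted = cv2.bitwise_not(merged)
```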
#### Inpainting
Each option corresponds to an option on the inpaint tab, so please refer to the inpaint tab for details on how to use each one.
## ControlNet Inpainting
You can use the ControlNet extension if you have ControlNet installed and ControlNet models.
Supports the `inpaint, scribble, lineart, openpose, tile` ControlNet models. Once you choose a model, the preprocessor is set automatically. It works separately from the model set by the ControlNet extension.
## Advanced Options
API request example: [wiki/API](https://github.com/Bing-su/adetailer/wiki/API)
`ui-config.json` entries: [wiki/ui-config.json](https://github.com/Bing-su/adetailer/wiki/ui-config.json)
`[SEP], [SKIP]` tokens: [wiki/Advanced](https://github.com/Bing-su/adetailer/wiki/Advanced)
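For orientation, here is a hedged sketch of what an API request with ADetailer enabled might look like; the authoritative argument schema is in [wiki/API](https://github.com/Bing-su/adetailer/wiki/API), and the field names below (`ad_model`, `ad_prompt`) are assumptions:

```python
import requests

payload = {
    "prompt": "a portrait photo",
    "steps": 20,
    "alwayson_scripts": {
        "ADetailer": {
            "args": [
                True,  # enable ADetailer (argument layout is an assumption)
                {"ad_model": "face_yolov8n.pt", "ad_prompt": "detailed face"},
            ]
        }
    },
}

# Assumes a local A1111 webui started with the --api flag.
r = requests.post("http://127.0.0.1:7860/sdapi/v1/txt2img", json=payload)
print(r.status_code)
```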
## Media
- 🎥 [どこよりも詳しいAfter Detailer (adetailer)の使い方① 【Stable Diffusion】](https://youtu.be/sF3POwPUWCE)
- 🎥 [どこよりも詳しいAfter Detailer (adetailer)の使い方② 【Stable Diffusion】](https://youtu.be/urNISRdbIEg)
## Model
| Model | Target | mAP 50 | mAP 50-95 |
| --------------------- | --------------------- | ----------------------------- | ----------------------------- |
| face_yolov8n.pt | 2D / realistic face | 0.660 | 0.366 |
| face_yolov8s.pt | 2D / realistic face | 0.713 | 0.404 |
| hand_yolov8n.pt | 2D / realistic hand | 0.767 | 0.505 |
| person_yolov8n-seg.pt | 2D / realistic person | 0.782 (bbox)<br/>0.761 (mask) | 0.555 (bbox)<br/>0.460 (mask) |
| person_yolov8s-seg.pt | 2D / realistic person | 0.824 (bbox)<br/>0.809 (mask) | 0.605 (bbox)<br/>0.508 (mask) |
| mediapipe_face_full | realistic face | - | - |
| mediapipe_face_short | realistic face | - | - |
| mediapipe_face_mesh | realistic face | - | - |
The yolo models can be found on huggingface [Bingsu/adetailer](https://huggingface.co/Bingsu/adetailer).
### Additional Model
Put your [ultralytics](https://github.com/ultralytics/ultralytics) yolo model in `webui/models/adetailer`. The model name should end with `.pt` or `.pth`.
It must be a bbox detection or segment model and use all labels.
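To sanity-check a custom model before dropping it into the folder, you can run it once with ultralytics (a sketch; the model and image file names are placeholders):

```python
from ultralytics import YOLO

model = YOLO("my_custom_detector.pt")  # hypothetical custom model
results = model("sample.jpg")          # hypothetical test image

print(results[0].boxes)  # bbox detection model -> boxes
print(results[0].masks)  # segmentation model -> masks (None for pure bbox models)
```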
## Example


[](https://ko-fi.com/F1F1L7V2N)
|
IsThatOnFire/old-adetailer
|
[
"region:us"
] |
2023-09-01T07:53:38+00:00
|
{}
|
2023-09-01T07:54:43+00:00
|
[] |
[] |
TAGS
#region-us
|
!After Detailer
===============
!After Detailer is an extension for the stable diffusion webui, similar to Detection Detailer, except it uses ultralytics instead of mmdet.
Install
-------
(from Mikubill/sd-webui-controlnet)
1. Open "Extensions" tab.
2. Open "Install from URL" tab in the tab.
3. Enter 'URL' to "URL for extension's git repository".
4. Press "Install" button.
5. Wait 5 seconds, and you will see the message "Installed into stable-diffusion-webui\extensions\adetailer. Use Installed tab to restart".
6. Go to "Installed" tab, click "Check for updates", and then click "Apply and restart UI". (The next time you can also use this method to update extensions.)
7. Completely restart A1111 webui including your terminal. (If you do not know what a "terminal" is, you can reboot your computer: turn your computer off and turn it on again.)
You can now install it directly from the Extensions tab.
!image
You DON'T need to download any model from huggingface.
Options
-------
Model, Prompts: ADetailer model
Model, Prompts: ADetailer prompt, negative prompt
Detection: Detection model confidence threshold
Detection: Mask min/max ratio
If you want to exclude objects in the background, try setting the min ratio to around '0.01'.
Mask Preprocessing: Mask x, y offset
Mask Preprocessing: Mask erosion (-) / dilation (+)
Mask Preprocessing: Mask merge mode
Applied in this order: x, y offset → erosion/dilation → merge/invert.
#### Inpainting
Each option corresponds to an option on the inpaint tab, so please refer to the inpaint tab for details on how to use each one.
ControlNet Inpainting
---------------------
You can use the ControlNet extension if you have ControlNet installed and ControlNet models.
Supports the 'inpaint, scribble, lineart, openpose, tile' ControlNet models. Once you choose a model, the preprocessor is set automatically. It works separately from the model set by the ControlNet extension.
Advanced Options
----------------
API request example: wiki/API
'URL' entries: wiki/URL
'[SEP], [SKIP]' tokens: wiki/Advanced
Media
-----
* どこよりも詳しいAfter Detailer (adetailer)の使い方① 【Stable Diffusion】
* どこよりも詳しいAfter Detailer (adetailer)の使い方② 【Stable Diffusion】
Model
-----
The yolo models can be found on huggingface Bingsu/adetailer.
### Additional Model
Put your ultralytics yolo model in 'webui/models/adetailer'. The model name should end with '.pt' or '.pth'.
It must be a bbox detection or segment model and use all labels.
Example
-------
!image
!image
|
[
"Media\n-----\n\n\n* どこよりも詳しいAfter Detailer (adetailer)の使い方① 【Stable Diffusion】\n* どこよりも詳しいAfter Detailer (adetailer)の使い方② 【Stable Diffusion】\n\n\nModel\n-----\n\n\n\nThe yolo models can be found on huggingface Bingsu/adetailer.",
"### Additional Model\n\n\nPut your ultralytics yolo model in 'webui/models/adetailer'. The model name should end with '.pt' or '.pth'.\n\n\nIt must be a bbox detection or segment model and use all labels.\n\n\nExample\n-------\n\n\n!image\n!image\n\n\n![ko-fi](URL)"
] |