Dataset columns (name, type, observed range):

| Column | Type | Observed range / cardinality |
|---|---|---|
| modelId | string | length 5 to 139 |
| author | string | length 2 to 42 |
| last_modified | timestamp[us, tz=UTC] | 2020-02-15 11:33:14 to 2025-09-22 00:45:16 |
| downloads | int64 | 0 to 223M |
| likes | int64 | 0 to 11.7k |
| library_name | string | 570 classes |
| tags | list | length 1 to 4.05k |
| pipeline_tag | string | 55 classes |
| createdAt | timestamp[us, tz=UTC] | 2022-03-02 23:29:04 to 2025-09-22 00:43:28 |
| card | string | length 11 to 1.01M |

Each row below is printed as `modelId | author | last_modified | downloads | likes | library_name | tags | pipeline_tag | createdAt |`, followed by the full card text.
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e0_s55555_v4_l4_v100 | KingKazma | 2023-08-13T20:45:28Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T18:03:26Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
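For context, a minimal sketch (not part of the card) of how a prompt-tuning adapter like this is typically loaded with peft; the `gpt2` base model is an assumption inferred from the repo name:

```python
# Hedged sketch: loading this prompt-tuning adapter with peft.
# Assumption: the base model is vanilla GPT-2, inferred from the repo name.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained("gpt2")
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = PeftModel.from_pretrained(
    base, "KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e0_s55555_v4_l4_v100"
)
```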
|
KingKazma/xsum_gpt2_prompt_tuning_500_10_3000_8_e1_s55555_v4_l5_v50 | KingKazma | 2023-08-13T20:45:13Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T20:12:11Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
nerijs/lego-minifig-xl | nerijs | 2023-08-13T20:43:31Z | 599 | 35 | diffusers | ["diffusers", "text-to-image", "stable-diffusion", "lora", "base_model:stabilityai/stable-diffusion-xl-base-1.0", "base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0", "license:apache-2.0", "region:us"] | text-to-image | 2023-08-13T20:36:32Z |
---
license: apache-2.0
tags:
- text-to-image
- stable-diffusion
- lora
- diffusers
base_model: stabilityai/stable-diffusion-xl-base-1.0
instance_prompt: lego minifig
widget:
- text: lego minifig of a samurai
---
# LEGO Minifig XL
## Consider supporting further research on [Patreon](https://www.patreon.com/user?u=29466374) or [Twitter](https://twitter.com/nerijs)

### Tips:
- Prompt with "lego minifig of a $SUBJECT" (see the usage sketch below)
- Works best at 1024x1024; going higher produces non-standard-size minifigs
- Best used at 0.8 strength
- You can use it for LEGO items or animals; just remove "minifig" from the prompt
### Limitations
- Tends to add items to the minifigs; this will be fixed in v2
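A minimal usage sketch (not from the original card), assuming the repo's LoRA weights load directly with diffusers' `load_lora_weights`, and that the 0.8 strength from the tips corresponds to the LoRA scale:

```python
# Hedged sketch: loading the LoRA on top of the SDXL base model named in the card.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("nerijs/lego-minifig-xl")

image = pipe(
    "lego minifig of a samurai",             # prompt pattern from the tips
    width=1024, height=1024,                 # recommended resolution
    cross_attention_kwargs={"scale": 0.8},   # LoRA strength from the tips (an assumption)
).images[0]
image.save("samurai_minifig.png")
```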
|
bigmorning/whisper_charsplit_new_round2__0055 | bigmorning | 2023-08-13T20:40:16Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T20:40:05Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0055
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0055
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0014
- Train Accuracy: 0.0795
- Train Wermet: 7.7408
- Validation Loss: 0.5661
- Validation Accuracy: 0.0769
- Validation Wermet: 7.1664
- Epoch: 54
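For reference, a minimal inference sketch (not part of the auto-generated card), assuming the TF checkpoint loads with `TFWhisperForConditionalGeneration`; the zero-filled array is just a stand-in for real 16 kHz audio:

```python
import numpy as np
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
model = TFWhisperForConditionalGeneration.from_pretrained(
    "bigmorning/whisper_charsplit_new_round2__0055"
)

audio = np.zeros(16000, dtype=np.float32)  # placeholder: 1 s of silence at 16 kHz
inputs = processor(audio, sampling_rate=16000, return_tensors="tf")
predicted_ids = model.generate(inputs.input_features)
print(processor.batch_decode(predicted_ids, skip_special_tokens=True))
```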
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
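The optimizer dictionary above maps one-to-one onto transformers' TF `AdamWeightDecay` class; as a hedged sketch (this is not the authors' training script):

```python
# Sketch only: the optimizer config above expressed in code with
# transformers' TF AdamWeightDecay implementation.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,
)
```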
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
| 0.0007 | 0.0795 | 8.5183 | 0.5688 | 0.0768 | 7.4310 | 28 |
| 0.0014 | 0.0795 | 8.6604 | 0.5750 | 0.0767 | 8.0751 | 29 |
| 0.0022 | 0.0795 | 8.2353 | 0.5789 | 0.0767 | 7.4442 | 30 |
| 0.0019 | 0.0795 | 8.6037 | 0.5715 | 0.0767 | 7.6157 | 31 |
| 0.0009 | 0.0795 | 8.4768 | 0.5611 | 0.0769 | 7.6392 | 32 |
| 0.0005 | 0.0795 | 8.2728 | 0.5669 | 0.0768 | 7.1451 | 33 |
| 0.0010 | 0.0795 | 8.1006 | 0.5918 | 0.0766 | 7.4447 | 34 |
| 0.0036 | 0.0795 | 8.9171 | 0.5687 | 0.0767 | 7.6962 | 35 |
| 0.0018 | 0.0795 | 8.4062 | 0.5713 | 0.0768 | 7.2127 | 36 |
| 0.0012 | 0.0795 | 8.3370 | 0.5683 | 0.0768 | 7.1040 | 37 |
| 0.0005 | 0.0795 | 7.9931 | 0.5658 | 0.0769 | 6.8043 | 38 |
| 0.0002 | 0.0795 | 7.9500 | 0.5660 | 0.0769 | 7.0891 | 39 |
| 0.0001 | 0.0795 | 8.1912 | 0.5632 | 0.0770 | 7.1929 | 40 |
| 0.0001 | 0.0795 | 8.2484 | 0.5678 | 0.0769 | 7.6993 | 41 |
| 0.0001 | 0.0795 | 8.2925 | 0.5648 | 0.0770 | 7.1917 | 42 |
| 0.0001 | 0.0795 | 7.9155 | 0.5752 | 0.0769 | 6.4900 | 43 |
| 0.0095 | 0.0793 | 8.3244 | 0.5662 | 0.0767 | 6.9524 | 44 |
| 0.0019 | 0.0795 | 7.8491 | 0.5533 | 0.0769 | 6.9541 | 45 |
| 0.0006 | 0.0795 | 8.0596 | 0.5573 | 0.0768 | 6.9489 | 46 |
| 0.0008 | 0.0795 | 8.0277 | 0.5581 | 0.0769 | 6.9081 | 47 |
| 0.0005 | 0.0795 | 7.6084 | 0.5604 | 0.0769 | 6.7158 | 48 |
| 0.0006 | 0.0795 | 8.0561 | 0.5729 | 0.0767 | 7.4189 | 49 |
| 0.0014 | 0.0795 | 8.2875 | 0.5658 | 0.0768 | 7.5768 | 50 |
| 0.0011 | 0.0795 | 8.4376 | 0.5665 | 0.0768 | 7.2469 | 51 |
| 0.0018 | 0.0795 | 8.3093 | 0.5771 | 0.0768 | 7.2637 | 52 |
| 0.0021 | 0.0795 | 7.8370 | 0.5680 | 0.0768 | 7.0030 | 53 |
| 0.0014 | 0.0795 | 7.7408 | 0.5661 | 0.0769 | 7.1664 | 54 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_prefix_tuning_500_10_3000_8_e5_s55555_v4_l4_v100 | KingKazma | 2023-08-13T20:40:08Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T20:40:06Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_prompt_tuning_500_10_3000_8_e0_s55555_v4_l5_v50 | KingKazma | 2023-08-13T20:37:26Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T20:04:51Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e-1_s55555_v4_l4_v100 | KingKazma | 2023-08-13T20:36:50Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T17:54:53Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0054 | bigmorning | 2023-08-13T20:35:53Z | 58 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T20:35:47Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0054
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0054
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0021
- Train Accuracy: 0.0795
- Train Wermet: 7.8370
- Validation Loss: 0.5680
- Validation Accuracy: 0.0768
- Validation Wermet: 7.0030
- Epoch: 53
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
| 0.0007 | 0.0795 | 8.5183 | 0.5688 | 0.0768 | 7.4310 | 28 |
| 0.0014 | 0.0795 | 8.6604 | 0.5750 | 0.0767 | 8.0751 | 29 |
| 0.0022 | 0.0795 | 8.2353 | 0.5789 | 0.0767 | 7.4442 | 30 |
| 0.0019 | 0.0795 | 8.6037 | 0.5715 | 0.0767 | 7.6157 | 31 |
| 0.0009 | 0.0795 | 8.4768 | 0.5611 | 0.0769 | 7.6392 | 32 |
| 0.0005 | 0.0795 | 8.2728 | 0.5669 | 0.0768 | 7.1451 | 33 |
| 0.0010 | 0.0795 | 8.1006 | 0.5918 | 0.0766 | 7.4447 | 34 |
| 0.0036 | 0.0795 | 8.9171 | 0.5687 | 0.0767 | 7.6962 | 35 |
| 0.0018 | 0.0795 | 8.4062 | 0.5713 | 0.0768 | 7.2127 | 36 |
| 0.0012 | 0.0795 | 8.3370 | 0.5683 | 0.0768 | 7.1040 | 37 |
| 0.0005 | 0.0795 | 7.9931 | 0.5658 | 0.0769 | 6.8043 | 38 |
| 0.0002 | 0.0795 | 7.9500 | 0.5660 | 0.0769 | 7.0891 | 39 |
| 0.0001 | 0.0795 | 8.1912 | 0.5632 | 0.0770 | 7.1929 | 40 |
| 0.0001 | 0.0795 | 8.2484 | 0.5678 | 0.0769 | 7.6993 | 41 |
| 0.0001 | 0.0795 | 8.2925 | 0.5648 | 0.0770 | 7.1917 | 42 |
| 0.0001 | 0.0795 | 7.9155 | 0.5752 | 0.0769 | 6.4900 | 43 |
| 0.0095 | 0.0793 | 8.3244 | 0.5662 | 0.0767 | 6.9524 | 44 |
| 0.0019 | 0.0795 | 7.8491 | 0.5533 | 0.0769 | 6.9541 | 45 |
| 0.0006 | 0.0795 | 8.0596 | 0.5573 | 0.0768 | 6.9489 | 46 |
| 0.0008 | 0.0795 | 8.0277 | 0.5581 | 0.0769 | 6.9081 | 47 |
| 0.0005 | 0.0795 | 7.6084 | 0.5604 | 0.0769 | 6.7158 | 48 |
| 0.0006 | 0.0795 | 8.0561 | 0.5729 | 0.0767 | 7.4189 | 49 |
| 0.0014 | 0.0795 | 8.2875 | 0.5658 | 0.0768 | 7.5768 | 50 |
| 0.0011 | 0.0795 | 8.4376 | 0.5665 | 0.0768 | 7.2469 | 51 |
| 0.0018 | 0.0795 | 8.3093 | 0.5771 | 0.0768 | 7.2637 | 52 |
| 0.0021 | 0.0795 | 7.8370 | 0.5680 | 0.0768 | 7.0030 | 53 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/xsum_gpt2_prefix_tuning_500_10_3000_8_e8_s108_v4_l4_v100 | KingKazma | 2023-08-13T20:31:39Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T20:31:34Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_prefix_tuning_500_10_3000_8_e7_s108_v4_l4_v100 | KingKazma | 2023-08-13T20:24:52Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T20:24:48Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0051 | bigmorning | 2023-08-13T20:22:48Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T20:22:41Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0051
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0051
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0014
- Train Accuracy: 0.0795
- Train Wermet: 8.2875
- Validation Loss: 0.5658
- Validation Accuracy: 0.0768
- Validation Wermet: 7.5768
- Epoch: 50
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
| 0.0007 | 0.0795 | 8.5183 | 0.5688 | 0.0768 | 7.4310 | 28 |
| 0.0014 | 0.0795 | 8.6604 | 0.5750 | 0.0767 | 8.0751 | 29 |
| 0.0022 | 0.0795 | 8.2353 | 0.5789 | 0.0767 | 7.4442 | 30 |
| 0.0019 | 0.0795 | 8.6037 | 0.5715 | 0.0767 | 7.6157 | 31 |
| 0.0009 | 0.0795 | 8.4768 | 0.5611 | 0.0769 | 7.6392 | 32 |
| 0.0005 | 0.0795 | 8.2728 | 0.5669 | 0.0768 | 7.1451 | 33 |
| 0.0010 | 0.0795 | 8.1006 | 0.5918 | 0.0766 | 7.4447 | 34 |
| 0.0036 | 0.0795 | 8.9171 | 0.5687 | 0.0767 | 7.6962 | 35 |
| 0.0018 | 0.0795 | 8.4062 | 0.5713 | 0.0768 | 7.2127 | 36 |
| 0.0012 | 0.0795 | 8.3370 | 0.5683 | 0.0768 | 7.1040 | 37 |
| 0.0005 | 0.0795 | 7.9931 | 0.5658 | 0.0769 | 6.8043 | 38 |
| 0.0002 | 0.0795 | 7.9500 | 0.5660 | 0.0769 | 7.0891 | 39 |
| 0.0001 | 0.0795 | 8.1912 | 0.5632 | 0.0770 | 7.1929 | 40 |
| 0.0001 | 0.0795 | 8.2484 | 0.5678 | 0.0769 | 7.6993 | 41 |
| 0.0001 | 0.0795 | 8.2925 | 0.5648 | 0.0770 | 7.1917 | 42 |
| 0.0001 | 0.0795 | 7.9155 | 0.5752 | 0.0769 | 6.4900 | 43 |
| 0.0095 | 0.0793 | 8.3244 | 0.5662 | 0.0767 | 6.9524 | 44 |
| 0.0019 | 0.0795 | 7.8491 | 0.5533 | 0.0769 | 6.9541 | 45 |
| 0.0006 | 0.0795 | 8.0596 | 0.5573 | 0.0768 | 6.9489 | 46 |
| 0.0008 | 0.0795 | 8.0277 | 0.5581 | 0.0769 | 6.9081 | 47 |
| 0.0005 | 0.0795 | 7.6084 | 0.5604 | 0.0769 | 6.7158 | 48 |
| 0.0006 | 0.0795 | 8.0561 | 0.5729 | 0.0767 | 7.4189 | 49 |
| 0.0014 | 0.0795 | 8.2875 | 0.5658 | 0.0768 | 7.5768 | 50 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_prefix_tuning_500_10_3000_8_e2_s55555_v4_l4_v100 | KingKazma | 2023-08-13T20:19:21Z | 1 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T20:19:20Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0050 | bigmorning | 2023-08-13T20:18:38Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T20:18:18Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0050
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0050
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0006
- Train Accuracy: 0.0795
- Train Wermet: 8.0561
- Validation Loss: 0.5729
- Validation Accuracy: 0.0767
- Validation Wermet: 7.4189
- Epoch: 49
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
| 0.0007 | 0.0795 | 8.5183 | 0.5688 | 0.0768 | 7.4310 | 28 |
| 0.0014 | 0.0795 | 8.6604 | 0.5750 | 0.0767 | 8.0751 | 29 |
| 0.0022 | 0.0795 | 8.2353 | 0.5789 | 0.0767 | 7.4442 | 30 |
| 0.0019 | 0.0795 | 8.6037 | 0.5715 | 0.0767 | 7.6157 | 31 |
| 0.0009 | 0.0795 | 8.4768 | 0.5611 | 0.0769 | 7.6392 | 32 |
| 0.0005 | 0.0795 | 8.2728 | 0.5669 | 0.0768 | 7.1451 | 33 |
| 0.0010 | 0.0795 | 8.1006 | 0.5918 | 0.0766 | 7.4447 | 34 |
| 0.0036 | 0.0795 | 8.9171 | 0.5687 | 0.0767 | 7.6962 | 35 |
| 0.0018 | 0.0795 | 8.4062 | 0.5713 | 0.0768 | 7.2127 | 36 |
| 0.0012 | 0.0795 | 8.3370 | 0.5683 | 0.0768 | 7.1040 | 37 |
| 0.0005 | 0.0795 | 7.9931 | 0.5658 | 0.0769 | 6.8043 | 38 |
| 0.0002 | 0.0795 | 7.9500 | 0.5660 | 0.0769 | 7.0891 | 39 |
| 0.0001 | 0.0795 | 8.1912 | 0.5632 | 0.0770 | 7.1929 | 40 |
| 0.0001 | 0.0795 | 8.2484 | 0.5678 | 0.0769 | 7.6993 | 41 |
| 0.0001 | 0.0795 | 8.2925 | 0.5648 | 0.0770 | 7.1917 | 42 |
| 0.0001 | 0.0795 | 7.9155 | 0.5752 | 0.0769 | 6.4900 | 43 |
| 0.0095 | 0.0793 | 8.3244 | 0.5662 | 0.0767 | 6.9524 | 44 |
| 0.0019 | 0.0795 | 7.8491 | 0.5533 | 0.0769 | 6.9541 | 45 |
| 0.0006 | 0.0795 | 8.0596 | 0.5573 | 0.0768 | 6.9489 | 46 |
| 0.0008 | 0.0795 | 8.0277 | 0.5581 | 0.0769 | 6.9081 | 47 |
| 0.0005 | 0.0795 | 7.6084 | 0.5604 | 0.0769 | 6.7158 | 48 |
| 0.0006 | 0.0795 | 8.0561 | 0.5729 | 0.0767 | 7.4189 | 49 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/xsum_gpt2_prefix_tuning_500_10_3000_8_e6_s108_v4_l4_v100 | KingKazma | 2023-08-13T20:18:06Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:22:24Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_8_e9_s108_v4_l4_v100 | KingKazma | 2023-08-13T20:13:10Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T20:13:09Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/cnn_dailymail_gpt2_prefix_tuning_500_10_3000_8_e0_s55555_v4_l4_v100 | KingKazma | 2023-08-13T20:05:30Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T20:05:29Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
redstonehero/lofi_v3 | redstonehero | 2023-08-13T20:05:07Z | 32 | 0 | diffusers | ["diffusers", "license:creativeml-openrail-m", "autotrain_compatible", "endpoints_compatible", "diffusers:StableDiffusionPipeline", "region:us"] | text-to-image | 2023-08-13T18:40:18Z |
---
license: creativeml-openrail-m
library_name: diffusers
---
|
redstonehero/m4rv3lsdungeonsv40 | redstonehero | 2023-08-13T20:05:01Z | 5 | 0 | diffusers | ["diffusers", "license:creativeml-openrail-m", "autotrain_compatible", "endpoints_compatible", "diffusers:StableDiffusionPipeline", "region:us"] | text-to-image | 2023-08-13T18:43:05Z |
---
license: creativeml-openrail-m
library_name: diffusers
---
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_8_e8_s108_v4_l4_v100 | KingKazma | 2023-08-13T20:04:21Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T20:04:19Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
JapGuy/MichalHruza_V1_1000Epochs_RVC_v2 | JapGuy | 2023-08-13T20:04:01Z | 0 | 0 | null | ["music", "rvc", "michal", "hruza", "model", "audio-to-audio", "cs", "license:openrail", "region:us"] | audio-to-audio | 2023-08-13T19:57:52Z |
---
license: openrail
language:
- cs
pipeline_tag: audio-to-audio
tags:
- music
- rvc
- michal
- hruza
- model
---

# Michal Hrůza [CZ] (v1)
# 1000 Epochs - RVC V2 - mangio-crepe - 64 Hop Length
Trained on 14 minutes of isolated acapellas extracted with UVR (Voc FT + Reverb HQ), then cleaned in Audacity with a noise gate and manual removal of sections containing double vocals or other singers' voices.
|
Ridhto/TomatsuHaruka | Ridhto | 2023-08-13T20:03:25Z | 0 | 1 | null | ["license:creativeml-openrail-m", "region:us"] | null | 2023-06-04T07:26:17Z |
---
license: creativeml-openrail-m
---
|
KingKazma/xsum_gpt2_prefix_tuning_500_10_3000_8_e3_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:57:57Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:01:34Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0044 | bigmorning | 2023-08-13T19:52:02Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T19:51:54Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0044
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0044
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0001
- Train Accuracy: 0.0795
- Train Wermet: 7.9155
- Validation Loss: 0.5752
- Validation Accuracy: 0.0769
- Validation Wermet: 6.4900
- Epoch: 43
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
| 0.0007 | 0.0795 | 8.5183 | 0.5688 | 0.0768 | 7.4310 | 28 |
| 0.0014 | 0.0795 | 8.6604 | 0.5750 | 0.0767 | 8.0751 | 29 |
| 0.0022 | 0.0795 | 8.2353 | 0.5789 | 0.0767 | 7.4442 | 30 |
| 0.0019 | 0.0795 | 8.6037 | 0.5715 | 0.0767 | 7.6157 | 31 |
| 0.0009 | 0.0795 | 8.4768 | 0.5611 | 0.0769 | 7.6392 | 32 |
| 0.0005 | 0.0795 | 8.2728 | 0.5669 | 0.0768 | 7.1451 | 33 |
| 0.0010 | 0.0795 | 8.1006 | 0.5918 | 0.0766 | 7.4447 | 34 |
| 0.0036 | 0.0795 | 8.9171 | 0.5687 | 0.0767 | 7.6962 | 35 |
| 0.0018 | 0.0795 | 8.4062 | 0.5713 | 0.0768 | 7.2127 | 36 |
| 0.0012 | 0.0795 | 8.3370 | 0.5683 | 0.0768 | 7.1040 | 37 |
| 0.0005 | 0.0795 | 7.9931 | 0.5658 | 0.0769 | 6.8043 | 38 |
| 0.0002 | 0.0795 | 7.9500 | 0.5660 | 0.0769 | 7.0891 | 39 |
| 0.0001 | 0.0795 | 8.1912 | 0.5632 | 0.0770 | 7.1929 | 40 |
| 0.0001 | 0.0795 | 8.2484 | 0.5678 | 0.0769 | 7.6993 | 41 |
| 0.0001 | 0.0795 | 8.2925 | 0.5648 | 0.0770 | 7.1917 | 42 |
| 0.0001 | 0.0795 | 7.9155 | 0.5752 | 0.0769 | 6.4900 | 43 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
s3nh/flozi00-Llama-2-13B-german-assistant-v3-GGML | s3nh | 2023-08-13T19:51:57Z | 0 | 0 | transformers | ["transformers", "text-generation", "zh", "en", "license:openrail", "endpoints_compatible", "region:us"] | text-generation | 2023-08-13T19:51:56Z |
---
license: openrail
pipeline_tag: text-generation
library_name: transformers
language:
- zh
- en
---
## Original model card
Buy me a coffee if you like this project ;)
<a href="https://www.buymeacoffee.com/s3nh"><img src="https://www.buymeacoffee.com/assets/img/guidelines/download-assets-sm-1.svg" alt=""></a>
#### Description
GGML-format model files for [this project](https://huggingface.co/Photolens/OpenOrcaxOpenChat-2-13b-langchain-chat).
### Inference
```python
from ctransformers import AutoModelForCausalLM

# output_dir / ggml_file: local directory and GGML file name of the downloaded weights
llm = AutoModelForCausalLM.from_pretrained(output_dir,
                                           model_file=ggml_file,
                                           gpu_layers=32,
                                           model_type="llama")

manual_input: str = "Tell me about your last dream, please."
print(llm(manual_input,
          max_new_tokens=256,
          temperature=0.9,
          top_p=0.7))
```
# Original model card
|
KingKazma/xsum_gpt2_prefix_tuning_500_10_3000_8_e2_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:51:14Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T18:54:36Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
sherif1311/flan-t5-base-tobacco_intent | sherif1311 | 2023-08-13T19:51:04Z | 103 | 0 | transformers | ["transformers", "pytorch", "tensorboard", "t5", "text2text-generation", "generated_from_trainer", "base_model:google/flan-t5-base", "base_model:finetune:google/flan-t5-base", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us"] | text2text-generation | 2023-08-13T17:57:41Z |
---
license: apache-2.0
base_model: google/flan-t5-base
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: flan-t5-base-tobacco_intent
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# flan-t5-base-tobacco_intent
This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0003
- F1: 100.0
- Gen Len: 2.3333
## Model description
Wrap any tweet in double quotation marks. The model returns one of three intent labels (see the inference sketch below):
- 0: Anti-tobacco
- 1: Neutral
- 2: Pro-tobacco
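A hedged inference sketch based on the description above; the exact input format the model expects is an assumption:

```python
from transformers import pipeline

# Hedged sketch: querying the intent classifier; the quoting convention follows the card.
classifier = pipeline("text2text-generation", model="sherif1311/flan-t5-base-tobacco_intent")
result = classifier('"Quit smoking today, your lungs will thank you."')
print(result[0]["generated_text"])  # expected: one of the labels 0, 1, or 2
```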
## Intended uses & limitations
This model was developed to monitor anti-tobacco and pro-tobacco intent in social media.
## Training and evaluation data
The model was developed and fine-tuned at STOP, University of Bath, UK.
The training data is sherif1311/intend, which was collected and augmented by STOP.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
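As a sketch (not the authors' script), the listed hyperparameters map directly onto transformers' `TrainingArguments`; the Adam betas and epsilon are the defaults, and the output path is hypothetical:

```python
from transformers import TrainingArguments

# Sketch only: the hyperparameters above expressed as TrainingArguments.
args = TrainingArguments(
    output_dir="flan-t5-base-tobacco_intent",  # hypothetical output path
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
)
```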
### Training results
### Framework versions
- Transformers 4.31.0
- Pytorch 1.12.1+cu116
- Datasets 2.14.4
- Tokenizers 0.12.1
|
KingKazma/xsum_gpt2_prompt_tuning_500_10_3000_8_e9_s108_v4_l5_v50 | KingKazma | 2023-08-13T19:47:47Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:47:45Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_8_e6_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:46:42Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:46:41Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_prompt_tuning_500_10_3000_8_e8_s108_v4_l5_v50 | KingKazma | 2023-08-13T19:40:27Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:40:26Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0041 | bigmorning | 2023-08-13T19:38:44Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T19:38:36Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0041
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0041
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0001
- Train Accuracy: 0.0795
- Train Wermet: 8.1912
- Validation Loss: 0.5632
- Validation Accuracy: 0.0770
- Validation Wermet: 7.1929
- Epoch: 40
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
| 0.0007 | 0.0795 | 8.5183 | 0.5688 | 0.0768 | 7.4310 | 28 |
| 0.0014 | 0.0795 | 8.6604 | 0.5750 | 0.0767 | 8.0751 | 29 |
| 0.0022 | 0.0795 | 8.2353 | 0.5789 | 0.0767 | 7.4442 | 30 |
| 0.0019 | 0.0795 | 8.6037 | 0.5715 | 0.0767 | 7.6157 | 31 |
| 0.0009 | 0.0795 | 8.4768 | 0.5611 | 0.0769 | 7.6392 | 32 |
| 0.0005 | 0.0795 | 8.2728 | 0.5669 | 0.0768 | 7.1451 | 33 |
| 0.0010 | 0.0795 | 8.1006 | 0.5918 | 0.0766 | 7.4447 | 34 |
| 0.0036 | 0.0795 | 8.9171 | 0.5687 | 0.0767 | 7.6962 | 35 |
| 0.0018 | 0.0795 | 8.4062 | 0.5713 | 0.0768 | 7.2127 | 36 |
| 0.0012 | 0.0795 | 8.3370 | 0.5683 | 0.0768 | 7.1040 | 37 |
| 0.0005 | 0.0795 | 7.9931 | 0.5658 | 0.0769 | 6.8043 | 38 |
| 0.0002 | 0.0795 | 7.9500 | 0.5660 | 0.0769 | 7.0891 | 39 |
| 0.0001 | 0.0795 | 8.1912 | 0.5632 | 0.0770 | 7.1929 | 40 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_8_e5_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:37:52Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:37:51Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_prefix_tuning_500_10_3000_8_e0_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:37:49Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T18:40:43Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
shreyasdatar/distilbert-base-uncased-finetuned-imdb | shreyasdatar | 2023-08-13T19:35:06Z | 125 | 0 | transformers | ["transformers", "pytorch", "tensorboard", "distilbert", "fill-mask", "generated_from_trainer", "dataset:tweet_eval", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us"] | fill-mask | 2023-07-19T14:09:37Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
datasets:
- tweet_eval
model-index:
- name: distilbert-base-uncased-finetuned-imdb
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-imdb
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the tweet_eval dataset.
It achieves the following results on the evaluation set:
- Loss: 3.1620
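For reference, a minimal fill-mask sketch (not part of the auto-generated card); the example sentence is hypothetical:

```python
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask", model="shreyasdatar/distilbert-base-uncased-finetuned-imdb"
)
# Print the top predicted tokens for the masked position.
for pred in fill_mask("This movie was absolutely [MASK]."):
    print(pred["token_str"], round(pred["score"], 3))
```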
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 3.6538 | 1.0 | 149 | 3.3045 |
| 3.3379 | 2.0 | 298 | 3.1949 |
| 3.2875 | 3.0 | 447 | 3.1166 |
### Framework versions
- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_prefix_tuning_500_10_3000_8_e7_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:33:59Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:33:58Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_prompt_tuning_500_10_3000_8_e7_s108_v4_l5_v50 | KingKazma | 2023-08-13T19:33:08Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:33:06Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_prefix_tuning_500_10_3000_8_e-1_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:31:02Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T18:33:45Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_8_e4_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:29:03Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:29:02Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KnutJaegersberg/Galactica-120B-GPTQ-2-bit-64g | KnutJaegersberg | 2023-08-13T19:27:04Z | 5 | 3 | transformers | ["transformers", "opt", "text-generation", "license:cc-by-4.0", "autotrain_compatible", "endpoints_compatible", "region:us"] | text-generation | 2023-08-13T11:23:27Z |
---
license: cc-by-4.0
---
Experimental quantization.

Working inference code below. Regular inference with AutoGPTQ does not work without `return_token_type_ids=False`; I didn't get it to work with textgen-webui.

```python
from auto_gptq import AutoGPTQForCausalLM
from transformers import AutoTokenizer

# Path or repo id of this quantized model
quantized_model_dir = "KnutJaegersberg/Galactica-120B-GPTQ-2-bit-64g"
tokenizer = AutoTokenizer.from_pretrained(quantized_model_dir, use_fast=True)
model = AutoGPTQForCausalLM.from_quantized(quantized_model_dir, device="cuda:0", use_triton=False)

input_ids = tokenizer("Question: What is the purpose of life?\n\nAnswer:", return_tensors="pt").input_ids.to("cuda:0")
out = model.generate(input_ids=input_ids, max_length=300)
print(tokenizer.decode(out[0]))
```

or

```python
print(tokenizer.decode(model.generate(**tokenizer("test is", return_tensors="pt", return_token_type_ids=False).to("cuda:0"))[0]))
```
|
KingKazma/cnn_dailymail_gpt2_prefix_tuning_500_10_3000_8_e6_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:27:03Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:27:02Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0037 | bigmorning | 2023-08-13T19:21:21Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T19:21:12Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0037
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0037
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0018
- Train Accuracy: 0.0795
- Train Wermet: 8.4062
- Validation Loss: 0.5713
- Validation Accuracy: 0.0768
- Validation Wermet: 7.2127
- Epoch: 36
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
| 0.0007 | 0.0795 | 8.5183 | 0.5688 | 0.0768 | 7.4310 | 28 |
| 0.0014 | 0.0795 | 8.6604 | 0.5750 | 0.0767 | 8.0751 | 29 |
| 0.0022 | 0.0795 | 8.2353 | 0.5789 | 0.0767 | 7.4442 | 30 |
| 0.0019 | 0.0795 | 8.6037 | 0.5715 | 0.0767 | 7.6157 | 31 |
| 0.0009 | 0.0795 | 8.4768 | 0.5611 | 0.0769 | 7.6392 | 32 |
| 0.0005 | 0.0795 | 8.2728 | 0.5669 | 0.0768 | 7.1451 | 33 |
| 0.0010 | 0.0795 | 8.1006 | 0.5918 | 0.0766 | 7.4447 | 34 |
| 0.0036 | 0.0795 | 8.9171 | 0.5687 | 0.0767 | 7.6962 | 35 |
| 0.0018 | 0.0795 | 8.4062 | 0.5713 | 0.0768 | 7.2127 | 36 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_8_e3_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:20:14Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:20:13Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/cnn_dailymail_gpt2_prefix_tuning_500_10_3000_8_e5_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:20:07Z | 1 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:20:06Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_prompt_tuning_500_10_3000_8_e5_s108_v4_l5_v50 | KingKazma | 2023-08-13T19:18:29Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:18:26Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/cnn_dailymail_gpt2_prefix_tuning_500_10_3000_8_e4_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:13:12Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:13:11Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_8_e2_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:11:24Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:11:23Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_prompt_tuning_500_10_3000_8_e4_s108_v4_l5_v50 | KingKazma | 2023-08-13T19:11:09Z | 1 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:11:07Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0034 | bigmorning | 2023-08-13T19:08:13Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T19:08:05Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0034
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0034
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0005
- Train Accuracy: 0.0795
- Train Wermet: 8.2728
- Validation Loss: 0.5669
- Validation Accuracy: 0.0768
- Validation Wermet: 7.1451
- Epoch: 33
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
| 0.0007 | 0.0795 | 8.5183 | 0.5688 | 0.0768 | 7.4310 | 28 |
| 0.0014 | 0.0795 | 8.6604 | 0.5750 | 0.0767 | 8.0751 | 29 |
| 0.0022 | 0.0795 | 8.2353 | 0.5789 | 0.0767 | 7.4442 | 30 |
| 0.0019 | 0.0795 | 8.6037 | 0.5715 | 0.0767 | 7.6157 | 31 |
| 0.0009 | 0.0795 | 8.4768 | 0.5611 | 0.0769 | 7.6392 | 32 |
| 0.0005 | 0.0795 | 8.2728 | 0.5669 | 0.0768 | 7.1451 | 33 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/xsum_gpt2_prompt_tuning_500_10_3000_8_e3_s108_v4_l5_v50 | KingKazma | 2023-08-13T19:03:49Z | 1 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:03:47Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0033 | bigmorning | 2023-08-13T19:03:48Z | 55 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T19:03:41Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0033
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0033
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0009
- Train Accuracy: 0.0795
- Train Wermet: 8.4768
- Validation Loss: 0.5611
- Validation Accuracy: 0.0769
- Validation Wermet: 7.6392
- Epoch: 32
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
| 0.0007 | 0.0795 | 8.5183 | 0.5688 | 0.0768 | 7.4310 | 28 |
| 0.0014 | 0.0795 | 8.6604 | 0.5750 | 0.0767 | 8.0751 | 29 |
| 0.0022 | 0.0795 | 8.2353 | 0.5789 | 0.0767 | 7.4442 | 30 |
| 0.0019 | 0.0795 | 8.6037 | 0.5715 | 0.0767 | 7.6157 | 31 |
| 0.0009 | 0.0795 | 8.4768 | 0.5611 | 0.0769 | 7.6392 | 32 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
bigmorning/whisper_charsplit_new_round2__0032
|
bigmorning
| 2023-08-13T18:59:27Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T18:59:19Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0032
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0032
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0019
- Train Accuracy: 0.0795
- Train Wermet: 8.6037
- Validation Loss: 0.5715
- Validation Accuracy: 0.0767
- Validation Wermet: 7.6157
- Epoch: 31
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
| 0.0007 | 0.0795 | 8.5183 | 0.5688 | 0.0768 | 7.4310 | 28 |
| 0.0014 | 0.0795 | 8.6604 | 0.5750 | 0.0767 | 8.0751 | 29 |
| 0.0022 | 0.0795 | 8.2353 | 0.5789 | 0.0767 | 7.4442 | 30 |
| 0.0019 | 0.0795 | 8.6037 | 0.5715 | 0.0767 | 7.6157 | 31 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_8_e0_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T18:53:46Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T15:46:10Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_prompt_tuning_500_10_3000_8_e1_s108_v4_l5_v50
|
KingKazma
| 2023-08-13T18:49:12Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:50:45Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
mani05/dqn-SpaceInvadersNoFrameskip-v4
|
mani05
| 2023-08-13T18:48:49Z | 7 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"SpaceInvadersNoFrameskip-v4",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-08-13T18:48:14Z |
---
library_name: stable-baselines3
tags:
- SpaceInvadersNoFrameskip-v4
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: DQN
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: SpaceInvadersNoFrameskip-v4
type: SpaceInvadersNoFrameskip-v4
metrics:
- type: mean_reward
value: 649.00 +/- 125.61
name: mean_reward
verified: false
---
# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4**
This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
Install the RL Zoo (with SB3 and SB3-Contrib):
```bash
pip install rl_zoo3
```
```
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga mani05 -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), you can run these commands from anywhere:
```
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga mani05 -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
## Training (with the RL Zoo)
```
python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga mani05
```
## Hyperparameters
```python
OrderedDict([('batch_size', 32),
('buffer_size', 100000),
('env_wrapper',
['stable_baselines3.common.atari_wrappers.AtariWrapper']),
('exploration_final_eps', 0.01),
('exploration_fraction', 0.1),
('frame_stack', 4),
('gradient_steps', 1),
('learning_rate', 0.0001),
('learning_starts', 100000),
('n_timesteps', 1000000.0),
('optimize_memory_usage', False),
('policy', 'CnnPolicy'),
('target_update_interval', 1000),
('train_freq', 4),
('normalize', False)])
```
# Environment Arguments
```python
{'render_mode': 'rgb_array'}
```
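As an alternative to the RL Zoo CLI above, a minimal Python loading sketch (assuming the checkpoint follows the RL Zoo naming convention `dqn-SpaceInvadersNoFrameskip-v4.zip` and that the `huggingface_sb3` helper package is installed):
```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import DQN

# Download the zipped checkpoint from the Hub (filename is an assumption
# based on the RL Zoo naming convention).
checkpoint = load_from_hub(
    repo_id="mani05/dqn-SpaceInvadersNoFrameskip-v4",
    filename="dqn-SpaceInvadersNoFrameskip-v4.zip",
)
model = DQN.load(checkpoint)
```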
|
Campqt/ppo-Huggy
|
Campqt
| 2023-08-13T18:47:09Z | 0 | 0 |
ml-agents
|
[
"ml-agents",
"tensorboard",
"onnx",
"Huggy",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-Huggy",
"region:us"
] |
reinforcement-learning
| 2023-08-13T18:47:03Z |
---
library_name: ml-agents
tags:
- Huggy
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Huggy
---
# **ppo** Agent playing **Huggy**
This is a trained model of a **ppo** agent playing **Huggy**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**
1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity
2. Find your model_id: Campqt/ppo-Huggy
3. Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
KingKazma/cnn_dailymail_gpt2_prefix_tuning_500_10_3000_8_e0_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T18:45:31Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T18:06:06Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_prompt_tuning_500_10_3000_8_e0_s108_v4_l5_v50
|
KingKazma
| 2023-08-13T18:41:53Z | 1 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:43:19Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0027
|
bigmorning
| 2023-08-13T18:37:27Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T18:37:20Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0027
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0027
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0011
- Train Accuracy: 0.0795
- Train Wermet: 8.4237
- Validation Loss: 0.5710
- Validation Accuracy: 0.0768
- Validation Wermet: 7.4035
- Epoch: 26
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/xsum_gpt2_prompt_tuning_500_10_3000_8_e-1_s108_v4_l5_v50
|
KingKazma
| 2023-08-13T18:34:32Z | 1 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:35:50Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KemalHal/whisper-base-bosnian-google
|
KemalHal
| 2023-08-13T18:29:56Z | 78 | 0 |
transformers
|
[
"transformers",
"pytorch",
"whisper",
"automatic-speech-recognition",
"bs",
"dataset:google/fleurs",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-11T15:15:09Z |
---
datasets:
- google/fleurs
language:
- bs
metrics:
- wer
pipeline_tag: automatic-speech-recognition
---
This model is a fine-tuned version of the Whisper AI base model for the Bosnian language. The dataset used is google/fleurs (bs_ba).
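A minimal transcription sketch (assumed usage; the card itself does not document inference):
```python
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="KemalHal/whisper-base-bosnian-google",
)
print(asr("audio.wav")["text"])  # path to a local audio file (hypothetical)
```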
|
josephamess/llama-2-7b-ExtraData-v2
|
josephamess
| 2023-08-13T18:24:51Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:36:15Z |
---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: True
- load_in_4bit: False
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: bfloat16
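A minimal loading sketch (assumptions: the base model is Llama-2-7B, inferred from the repository name, quantized with the 8-bit config above):
```python
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",  # assumed base model, inferred from the repo name
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
    device_map="auto",
)
# Attach the trained adapter weights on top of the quantized base model.
model = PeftModel.from_pretrained(base, "josephamess/llama-2-7b-ExtraData-v2")
model.eval()
```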
### Framework versions
- PEFT 0.5.0.dev0
|
charliezjw/t2
|
charliezjw
| 2023-08-13T18:20:49Z | 0 | 1 |
diffusers
|
[
"diffusers",
"tensorboard",
"stable-diffusion-xl",
"stable-diffusion-xl-diffusers",
"text-to-image",
"lora",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0",
"license:openrail++",
"region:us"
] |
text-to-image
| 2023-08-13T17:48:17Z |
---
license: openrail++
base_model: stabilityai/stable-diffusion-xl-base-1.0
instance_prompt: a photo of sks dog
tags:
- stable-diffusion-xl
- stable-diffusion-xl-diffusers
- text-to-image
- diffusers
- lora
inference: true
---
# LoRA DreamBooth - charliezjw/t2
These are LoRA adaptation weights for stabilityai/stable-diffusion-xl-base-1.0. The weights were trained on `a photo of sks dog` using [DreamBooth](https://dreambooth.github.io/). You can find some example images below.




LoRA for the text encoder was enabled: False.
Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.
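A minimal inference sketch (assumed usage via diffusers' standard LoRA loading API, reusing the fp16-fix VAE mentioned above):
```python
import torch
from diffusers import AutoencoderKL, DiffusionPipeline

# Same VAE that was used for training, per the note above.
vae = AutoencoderKL.from_pretrained(
    "madebyollin/sdxl-vae-fp16-fix", torch_dtype=torch.float16
)
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", vae=vae, torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("charliezjw/t2")

image = pipe("a photo of sks dog swimming in a pool").images[0]
image.save("sks_dog.png")
```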
|
Jiuzhouh/alpaca-lora-amr-parsing-llama-7b
|
Jiuzhouh
| 2023-08-13T18:19:45Z | 4 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T18:19:36Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.4.0.dev0
|
CyberHarem/kirsten_arknights
|
CyberHarem
| 2023-08-13T18:18:54Z | 0 | 0 | null |
[
"art",
"text-to-image",
"dataset:CyberHarem/kirsten_arknights",
"license:mit",
"region:us"
] |
text-to-image
| 2023-08-13T18:15:08Z |
---
license: mit
datasets:
- CyberHarem/kirsten_arknights
pipeline_tag: text-to-image
tags:
- art
---
# Lora of kirsten_arknights
This model was trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion); the auto-training framework is maintained by the [DeepGHS Team](https://huggingface.co/deepghs).
After downloading the pt and safetensors files for the specified step, you need to use them together: the pt file is loaded as an embedding, while the safetensors file is loaded as the LoRA.
For example, if you want to use the model from step 1500, you need to download `1500/kirsten_arknights.pt` as the embedding and `1500/kirsten_arknights.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters (see the sketch after the step table below).
**The trigger word is `kirsten_arknights`.**
These are available steps:
| Steps | bikini | free | nude | Download |
|--------:|:-----------------------------------------|:-------------------------------------|:-----------------------------------------------|:---------------------------------------|
| 1500 |  |  | [<NSFW, click to see>](1500/previews/nude.png) | [Download](1500/kirsten_arknights.zip) |
| 1400 |  |  | [<NSFW, click to see>](1400/previews/nude.png) | [Download](1400/kirsten_arknights.zip) |
| 1300 |  |  | [<NSFW, click to see>](1300/previews/nude.png) | [Download](1300/kirsten_arknights.zip) |
| 1200 |  |  | [<NSFW, click to see>](1200/previews/nude.png) | [Download](1200/kirsten_arknights.zip) |
| 1100 |  |  | [<NSFW, click to see>](1100/previews/nude.png) | [Download](1100/kirsten_arknights.zip) |
| 1000 |  |  | [<NSFW, click to see>](1000/previews/nude.png) | [Download](1000/kirsten_arknights.zip) |
| 900 |  |  | [<NSFW, click to see>](900/previews/nude.png) | [Download](900/kirsten_arknights.zip) |
| 800 |  |  | [<NSFW, click to see>](800/previews/nude.png) | [Download](800/kirsten_arknights.zip) |
| 700 |  |  | [<NSFW, click to see>](700/previews/nude.png) | [Download](700/kirsten_arknights.zip) |
| 600 |  |  | [<NSFW, click to see>](600/previews/nude.png) | [Download](600/kirsten_arknights.zip) |
| 500 |  |  | [<NSFW, click to see>](500/previews/nude.png) | [Download](500/kirsten_arknights.zip) |
| 400 |  |  | [<NSFW, click to see>](400/previews/nude.png) | [Download](400/kirsten_arknights.zip) |
| 300 |  |  | [<NSFW, click to see>](300/previews/nude.png) | [Download](300/kirsten_arknights.zip) |
| 200 |  |  | [<NSFW, click to see>](200/previews/nude.png) | [Download](200/kirsten_arknights.zip) |
| 100 |  |  | [<NSFW, click to see>](100/previews/nude.png) | [Download](100/kirsten_arknights.zip) |
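A rough diffusers sketch (assumptions: the HCP-Diffusion exports are compatible with diffusers' textual-inversion and LoRA loaders, the base model is a Stable Diffusion 1.5 checkpoint, and the step-1500 files have been downloaded locally):
```python
from diffusers import StableDiffusionPipeline

# Assumed SD 1.5-compatible base model; swap in your preferred checkpoint.
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
# Load the pt file as a textual-inversion embedding bound to the trigger word...
pipe.load_textual_inversion("1500/kirsten_arknights.pt", token="kirsten_arknights")
# ...and the safetensors file as the LoRA weights.
pipe.load_lora_weights("1500/kirsten_arknights.safetensors")

image = pipe("kirsten_arknights, best quality").images[0]
```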
|
bigmorning/whisper_charsplit_new_round2__0021
|
bigmorning
| 2023-08-13T18:11:16Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T18:11:08Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0021
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0021
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0058
- Train Accuracy: 0.0794
- Train Wermet: 8.8460
- Validation Loss: 0.5706
- Validation Accuracy: 0.0766
- Validation Wermet: 7.4342
- Epoch: 20
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/xsum_gpt2_p_tuning_500_10_3000_8_e6_s55555_v4_l4_v100
|
KingKazma
| 2023-08-13T18:01:56Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T18:01:55Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_p_tuning_500_10_3000_8_e5_s55555_v4_l4_v100
|
KingKazma
| 2023-08-13T17:53:21Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:53:20Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e9_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T17:42:02Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:41:58Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0013
|
bigmorning
| 2023-08-13T17:35:54Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T17:35:47Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0013
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0013
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0005
- Train Accuracy: 0.0795
- Train Wermet: 9.2292
- Validation Loss: 0.5687
- Validation Accuracy: 0.0767
- Validation Wermet: 8.5576
- Epoch: 12
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
josephamess/llama-2-7b-ExtraData
|
josephamess
| 2023-08-13T17:35:06Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-12T19:49:17Z |
---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: True
- load_in_4bit: False
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: bfloat16
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e8_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T17:33:23Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:33:19Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_lora_500_10_3000_8_e9_s55555_v4_l4_r4
|
KingKazma
| 2023-08-13T17:29:14Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:29:12Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
LarryAIDraw/raiden_shogun
|
LarryAIDraw
| 2023-08-13T17:27:33Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-08-13T17:22:03Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/127435/raiden-shogun-my-birthday-special-or-goofy-ai
|
bigmorning/whisper_charsplit_new_round2__0011
|
bigmorning
| 2023-08-13T17:27:17Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T17:27:08Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0011
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0011
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0005
- Train Accuracy: 0.0795
- Train Wermet: 9.3749
- Validation Loss: 0.5552
- Validation Accuracy: 0.0768
- Validation Wermet: 8.0800
- Epoch: 10
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
bigmorning/whisper_charsplit_new_round2__0010
|
bigmorning
| 2023-08-13T17:22:53Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T17:22:45Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0010
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0010
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0011
- Train Accuracy: 0.0795
- Train Wermet: 8.9730
- Validation Loss: 0.5605
- Validation Accuracy: 0.0767
- Validation Wermet: 8.3958
- Epoch: 9
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e6_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T17:16:05Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:16:02Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_lora_500_10_3000_8_e7_s55555_v4_l4_r4
|
KingKazma
| 2023-08-13T17:15:30Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:15:28Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0008
|
bigmorning
| 2023-08-13T17:14:11Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T17:14:03Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0008
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0008
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0037
- Train Accuracy: 0.0795
- Train Wermet: 9.3428
- Validation Loss: 0.5717
- Validation Accuracy: 0.0764
- Validation Wermet: 8.2631
- Epoch: 7
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/xsum_gpt2_p_tuning_500_10_3000_8_e0_s55555_v4_l4_v100
|
KingKazma
| 2023-08-13T17:10:36Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:10:35Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0007
|
bigmorning
| 2023-08-13T17:09:49Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T17:09:41Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0007
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0007
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0009
- Train Accuracy: 0.0795
- Train Wermet: 8.7510
- Validation Loss: 0.5642
- Validation Accuracy: 0.0766
- Validation Wermet: 7.9083
- Epoch: 6
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e5_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T17:07:27Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:07:23Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0006
|
bigmorning
| 2023-08-13T17:05:24Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T17:05:17Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0006
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0006
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0012
- Train Accuracy: 0.0795
- Train Wermet: 8.8862
- Validation Loss: 0.5667
- Validation Accuracy: 0.0767
- Validation Wermet: 8.2913
- Epoch: 5
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
jiaqixuac/controlnet_training
|
jiaqixuac
| 2023-08-13T17:05:05Z | 0 | 0 |
diffusers
|
[
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"controlnet",
"base_model:runwayml/stable-diffusion-v1-5",
"base_model:adapter:runwayml/stable-diffusion-v1-5",
"license:creativeml-openrail-m",
"region:us"
] |
text-to-image
| 2023-08-13T14:21:44Z |
---
license: creativeml-openrail-m
base_model: runwayml/stable-diffusion-v1-5
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- controlnet
inference: true
---
# controlnet-jiaqixuac/controlnet_training
These are ControlNet weights trained on runwayml/stable-diffusion-v1-5 with a new type of conditioning.
You can find some example images below.
prompt: red circle with blue background

prompt: cyan circle with brown floral background

|
KingKazma/cnn_dailymail_gpt2_lora_500_10_3000_8_e3_s55555_v4_l4_r2
|
KingKazma
| 2023-08-13T17:02:17Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:02:15Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
DveloperY0115/pokemon-lora
|
DveloperY0115
| 2023-08-13T16:55:41Z | 4 | 0 |
diffusers
|
[
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"lora",
"base_model:runwayml/stable-diffusion-v1-5",
"base_model:adapter:runwayml/stable-diffusion-v1-5",
"license:creativeml-openrail-m",
"region:us"
] |
text-to-image
| 2023-08-13T13:46:41Z |
---
license: creativeml-openrail-m
base_model: runwayml/stable-diffusion-v1-5
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- lora
inference: true
---
# LoRA text2image fine-tuning - DveloperY0115/pokemon-lora
These are LoRA adaptation weights for runwayml/stable-diffusion-v1-5. The weights were fine-tuned on the lambdalabs/pokemon-blip-captions dataset. You can find some example images below.




|
bigmorning/whisper_charsplit_new_round2__0003
|
bigmorning
| 2023-08-13T16:52:03Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T16:51:56Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0003
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0003
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0025
- Train Accuracy: 0.0795
- Train Wermet: 8.7338
- Validation Loss: 0.5673
- Validation Accuracy: 0.0765
- Validation Wermet: 8.3770
- Epoch: 2
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
jsnbuchanan/segformer-b0-scene-parse-150
|
jsnbuchanan
| 2023-08-13T16:50:58Z | 31 | 0 |
transformers
|
[
"transformers",
"pytorch",
"segformer",
"generated_from_trainer",
"dataset:scene_parse_150",
"base_model:nvidia/mit-b0",
"base_model:finetune:nvidia/mit-b0",
"license:other",
"endpoints_compatible",
"region:us"
] | null | 2023-08-01T20:37:16Z |
---
license: other
base_model: nvidia/mit-b0
tags:
- generated_from_trainer
datasets:
- scene_parse_150
model-index:
- name: segformer-b0-scene-parse-150
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-scene-parse-150
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the scene_parse_150 dataset.
It achieves the following results on the evaluation set:
- Loss: 4.5716
- Mean Iou: 0.0039
- Mean Accuracy: 0.0219
- Overall Accuracy: 0.1398
- Per Category Iou: [0.1424604255351693, 0.0028172808510882213, 0.009342676914231785, 0.0, 0.0, 0.02331811292704824, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan]
- Per Category Accuracy: [0.9514506078251098, 0.0028769356391743226, 0.00966095515858549, 0.0, 0.0, 0.045009037210949, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan]
## Model description
More information needed
## Intended uses & limitations
More information needed
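A minimal segmentation sketch (assumed usage via the standard transformers Segformer API; given the low reported mean IoU, expect rough predictions):
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

model_id = "jsnbuchanan/segformer-b0-scene-parse-150"
processor = AutoImageProcessor.from_pretrained(model_id)
model = SegformerForSemanticSegmentation.from_pretrained(model_id)

image = Image.open("scene.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)
pred = logits.argmax(dim=1)[0]  # per-pixel class indices
```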
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|
| 4.8633 | 1.0 | 20 | 4.8626 | 0.0009 | 0.0023 | 0.0067 | [0.0, 0.0004889263991152761, 0.0, 0.0, 0.0, 0.0, 0.03284478144986514, 0.0, 0.0, 0.014472940861907617, 0.0, 0.0009606283639651349, 0.0, 0.001090056864633105, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0033905507210453163, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, 0.011123126834543489, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan] | [0.0, 0.0004916838121884905, 0.0, 0.0, 0.0, 0.0, 0.05156049842785606, 0.0, 0.0, 0.02758031245634076, 0.0, 0.002084802403654536, 0.0, 0.0011670427137633237, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.003448710560437977, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.02041973908111174, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 4.8081 | 2.0 | 40 | 4.6105 | 0.0014 | 0.0050 | 0.0207 | [0.0014870097866647892, 0.00010797969981643452, 0.0, 0.0, 0.0, 0.005608097195054107, 0.06877789289425044, 0.0, 0.0, 0.012758644335110486, 0.0, 0.0, 0.0, 0.0023358985966500678, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0030744981206588954, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan] | [0.0014982803827425341, 0.0001082875062557985, 0.0, 0.0, 0.0, 0.005822945199098142, 0.19299522534063118, 0.0, 0.0, 0.024465610900633625, 0.0, 0.0, 0.0, 0.002509141834591146, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.00319658550641118, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 4.6722 | 3.0 | 60 | 4.3976 | 0.0024 | 0.0122 | 0.0530 | [0.025172656541163803, 0.0010250959756997078, 0.0, 0.00034731034851257663, 0.0, 0.01513758223102497, 0.08697308653455893, 0.0, 0.0, 0.018849001504380403, 0.0, 0.0, 0.0, 0.002455752600466593, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0010695187165775401, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan] | [0.031049590810546986, 0.001033121343467483, 0.0, 0.0003484413948377067, 0.0, 0.01993776436171204, 0.465229832471011, 0.0, 0.0, 0.042011489057453506, 0.0, 0.0, 0.0, 0.0027230996654477556, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0010805359458291313, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 4.3876 | 4.0 | 80 | 4.1595 | 0.0032 | 0.0120 | 0.0594 | [0.07854437258265586, 0.002560575118273506, 0.0, 0.0, 0.0, 0.001192829675600377, 0.0810610478529872, 0.0, 0.0, 0.014937473673721733, 0.0, 0.0, 0.0, 0.005779474740910093, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan] | [0.16734486555203687, 0.0026310937330800773, 0.0, 0.0, 0.0, 0.001332289861553655, 0.3434053136801477, 0.0, 0.0, 0.027103656281588746, 0.0, 0.0, 0.0, 0.009686454524235588, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 4.5927 | 5.0 | 100 | 4.1201 | 0.0038 | 0.0139 | 0.0738 | [0.10272745389575265, 5.808145343029064e-06, 0.0011583054074428688, 0.0, 0.0, 0.004795549066148129, 0.07399210984610646, 0.0, 0.0, 0.021106484070283166, 0.0, 0.0, 0.0, 0.004504373601464648, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.31741694191893394, 5.853378716529649e-06, 0.0011806276115426681, 0.0, 0.0, 0.00645648163676002, 0.2597614333959973, 0.0, 0.0, 0.0499420616201379, 0.0, 0.0, 0.0, 0.006029720687777173, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.6357 | 6.0 | 120 | 3.8936 | 0.0031 | 0.0183 | 0.1193 | [0.13782355615369993, 0.00012947106753210882, 0.0020724837921139334, 0.0, 0.0, 0.002555414284014255, 0.00625454345947577, 0.0, 0.0, 0.0021069049880189433, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.8250104993132882, 0.0001317010211219171, 0.002187386073641998, 0.0, 0.0, 0.0038012186259712673, 0.0075862183699612375, 0.0, 0.0, 0.0027531003196883653, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.9752 | 7.0 | 140 | 3.7886 | 0.0046 | 0.0208 | 0.1099 | [0.12970556062622363, 0.02142529574722971, 0.015682682019160826, 0.0, 0.0, 0.046496050677108665, 0.011263870769586333, 0.0, 0.0, 7.157683773530885e-05, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.6339173221643342, 0.026799694453630996, 0.017715022855380128, 0.0, 0.0, 0.2645107794361526, 0.01540534695303532, 0.0, 0.0, 8.218209909517509e-05, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 4.3363 | 8.0 | 160 | 3.7281 | 0.0044 | 0.0211 | 0.1287 | [0.13844309203854324, 0.006048050986190329, 0.03636386971467197, 0.0, 0.0, 0.029408017788998847, 0.00848082968593638, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.8096133982588166, 0.006476763549840056, 0.047673799040915336, 0.0, 0.0, 0.09462984701958373, 0.010468482257232695, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 4.0164 | 9.0 | 180 | 3.7593 | 0.0058 | 0.0207 | 0.1113 | [0.1257575586728653, 0.031513436815826704, 0.04777990089072147, 0.0, 1.3152702880441932e-05, 0.04167081950119985, 0.042454800300151495, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.5380103517553717, 0.04309550080044954, 0.06794918533890462, 0.0, 1.6288501946475982e-05, 0.20255464251774832, 0.09977291254221497, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.5911 | 10.0 | 200 | 3.4747 | 0.0045 | 0.0213 | 0.1331 | [0.1420633887795078, 0.015648167497586848, 0.018910536514932744, 0.0, 0.0, 0.024232767348835664, 0.007368267428066774, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.8642552297930784, 0.018408876063485746, 0.022330968338988753, 0.0, 0.0, 0.0680119999254663, 0.008189289457485567, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.6199 | 11.0 | 220 | 3.5161 | 0.0043 | 0.0210 | 0.1347 | [0.14182326280462046, 0.005032394655020479, 0.028134080245829304, 0.0, 0.0, 0.014572871324844493, 0.013168143969916734, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.8793316761444252, 0.005314867874608921, 0.037280910849995796, 0.0, 0.0, 0.029226526543313397, 0.016312033139796036, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.6997 | 12.0 | 240 | 3.4768 | 0.0032 | 0.0217 | 0.1424 | [0.14365582667494992, 0.0001899779333323591, 0.00036495363737125246, 0.0, 0.0, 0.0013590921597892156, 0.0009173402092527396, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9941061962974314, 0.0001902348082872136, 0.0003785860512072689, 0.0, 0.0, 0.0015186241079247955, 0.0009233226305544927, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.8775 | 13.0 | 260 | 3.4909 | 0.0034 | 0.0215 | 0.1412 | [0.14366726677393113, 0.0, 0.00935392023848197, 0.0, 0.0, 0.002252103613265138, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9757607745655554, 0.0, 0.01098179982613085, 0.0, 0.0, 0.0025434624629660685, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.6896 | 14.0 | 280 | 3.4944 | 0.0032 | 0.0216 | 0.1419 | [0.14372170627530323, 0.0, 0.0026556544494494754, 0.0, 0.0, 0.0026339040125360493, 5.807995950997109e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9887260076503105, 0.0, 0.002880058330295297, 0.0, 0.0, 0.002944081092664021, 5.822755327821125e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.5723 | 15.0 | 300 | 3.4798 | 0.0033 | 0.0216 | 0.1420 | [0.14360235216056103, 0.0, 0.005107262858619798, 0.0, 0.0, 0.0031149607580008946, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.986302652637314, 0.0, 0.005541378053226394, 0.0, 0.0, 0.0035683008180073415, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.7581 | 16.0 | 320 | 3.5572 | 0.0037 | 0.0217 | 0.1428 | [0.14338450814233764, 0.0, 0.023961774556760896, 0.0, 0.0, 0.0016283550899666187, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9689248703192926, 0.0, 0.029392299279284332, 0.0, 0.0, 0.001677008217340265, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.5605 | 17.0 | 340 | 3.4733 | 0.0035 | 0.0217 | 0.1426 | [0.14333996536268165, 5.851409311932779e-06, 0.015588103385475387, 0.0, 0.0, 0.0036605352700892673, 2.4902258634858183e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9784338429756756, 5.853378716529649e-06, 0.017642109986258727, 0.0, 0.0, 0.004099353420165092, 2.4954665690661966e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.7163 | 18.0 | 360 | 3.5516 | 0.0036 | 0.0217 | 0.1420 | [0.1431641368804744, 1.7551198308064484e-05, 0.017346049550404807, 0.0, 0.0, 0.006382572138352761, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9707466430574, 1.7560136149588946e-05, 0.019983734821503688, 0.0, 0.0, 0.007779454785995118, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.2231 | 19.0 | 380 | 3.5751 | 0.0051 | 0.0222 | 0.1352 | [0.14144860113740007, 0.003382545372614381, 0.05970141064540731, 0.0, 0.0, 0.03208762207850171, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.8172324945233311, 0.003427153238528109, 0.0915421071819176, 0.0, 0.0, 0.1078409450872976, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.7188 | 20.0 | 400 | 3.4891 | 0.0042 | 0.0218 | 0.1394 | [0.14279628860196056, 0.00783605754787311, 0.023641167287138224, 0.0, 0.0, 0.019599678095523283, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9274525828310689, 0.00824448392223201, 0.02760312964468998, 0.0, 0.0, 0.03902770790243539, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.6156 | 21.0 | 420 | 3.5195 | 0.0039 | 0.0218 | 0.1430 | [0.1431243377907112, 0.003771829965220418, 0.028600253499599246, 0.0, 0.0, 0.003834416457157314, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9602076026378815, 0.003869083331626098, 0.03442328724866093, 0.0, 0.0, 0.004518605474500158, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.7812 | 22.0 | 440 | 3.5044 | 0.0034 | 0.0218 | 0.1431 | [0.14375063548866288, 0.003941135854106936, 0.008966284779050737, 0.0, 0.0, 0.0008825520320117902, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9869070725644431, 0.004059318139913311, 0.00960206399506436, 0.0, 0.0, 0.0009596213688113738, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.9414 | 23.0 | 460 | 3.6536 | 0.0031 | 0.0217 | 0.1430 | [0.1433341827741543, 0.0009574223513012504, 1.945649666321082e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9991458666757471, 0.0009628807988691273, 1.9630387840376903e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.3775 | 24.0 | 480 | 3.5794 | 0.0031 | 0.0217 | 0.1429 | [0.14339061919565174, 0.0004489089181295073, 0.0005344750160480256, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9989443933667042, 0.00045071016117278296, 0.0005440421772904456, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.7282 | 25.0 | 500 | 4.1077 | 0.0031 | 0.0217 | 0.1430 | [0.14301211156076965, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9999744611298396, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.2262 | 26.0 | 520 | 4.1736 | 0.0031 | 0.0217 | 0.1430 | [0.1429770208690924, 0.00013157471543312935, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9998581173879979, 0.0001317010211219171, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.3175 | 27.0 | 540 | 4.0696 | 0.0031 | 0.0217 | 0.1430 | [0.143028637658645, 0.0009629118921073225, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9990578994563059, 0.0009687341775856569, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 2.8936 | 28.0 | 560 | 4.5036 | 0.0031 | 0.0217 | 0.1430 | [0.14297076727730323, 0.0002453013117779673, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9997531242551163, 0.00024584190609424527, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.7712 | 29.0 | 580 | 4.0164 | 0.0031 | 0.0217 | 0.1429 | [0.14297121162029977, 0.0007422241109319424, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9989103415398236, 0.0007463057863575303, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.0557 | 30.0 | 600 | 4.2748 | 0.0031 | 0.0217 | 0.1429 | [0.14303412843917407, 0.0003856052395572551, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9988961532786234, 0.00038632299529095683, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.3498 | 31.0 | 620 | 4.4146 | 0.0031 | 0.0216 | 0.1423 | [0.14290109320524125, 0.0001608060182383262, 0.0, 0.0, 0.0, 0.0005434010215939206, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9951334264083268, 0.00016096791470456535, 0.0, 0.0, 0.0, 0.0006055863007062068, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.6119 | 32.0 | 640 | 4.1771 | 0.0031 | 0.0217 | 0.1425 | [0.14301796804098232, 0.00011991109031352363, 0.0, 0.0, 0.0, 2.5596614421132564e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9967650764463514, 0.00011999426368885781, 0.0, 0.0, 0.0, 2.7950136955671083e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.2928 | 33.0 | 660 | 4.2533 | 0.0031 | 0.0217 | 0.1426 | [0.1430651199052239, 0.0, 0.0, 0.0, 0.0, 0.0005383649088625119, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9971878866301177, 0.0, 0.0, 0.0, 0.0, 0.0005869528760690928, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 2.9388 | 34.0 | 680 | 4.4678 | 0.0031 | 0.0217 | 0.1424 | [0.14302521070021848, 8.777165326686094e-06, 0.0, 0.0, 0.0, 0.00013477429517255322, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9962911885222642, 8.780068074794473e-06, 0.0, 0.0, 0.0, 0.00014906739709691244, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.0637 | 35.0 | 700 | 4.3135 | 0.0032 | 0.0216 | 0.1418 | [0.1429503635871641, 0.00015791597700275475, 1.398789208061502e-05, 0.0, 0.0, 0.0028473543966501713, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9908826233527429, 0.00015804122534630053, 1.4021705600269217e-05, 0.0, 0.0, 0.0034844504071403284, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 2.8097 | 36.0 | 720 | 4.5076 | 0.0033 | 0.0215 | 0.1403 | [0.14292913158156023, 0.0009571919257359143, 6.428765090128491e-05, 0.0, 0.0, 0.008040658872532778, 0.00016170025747656383, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9762885778822034, 0.0009599541095108625, 6.44998457612384e-05, 0.0, 0.0, 0.012661412040919001, 0.00016220532698930278, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.5005 | 37.0 | 740 | 4.9189 | 0.0032 | 0.0216 | 0.1416 | [0.14321751517531003, 0.0011148109769586326, 3.077732326821598e-05, 0.0, 0.0, 0.002581131431496914, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9885330472979875, 0.0011209220242154277, 3.0847752320592275e-05, 0.0, 0.0, 0.003381966571636201, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.3499 | 38.0 | 760 | 4.8759 | 0.0037 | 0.0216 | 0.1383 | [0.14275769206892197, 0.00015486525262029086, 0.008065831982682663, 0.0, 0.0, 0.018616346885301328, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9461782499631105, 0.0001551145359880357, 0.0083176757620797, 0.0, 0.0, 0.041040117763243705, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.0865 | 39.0 | 780 | 3.8349 | 0.0041 | 0.0219 | 0.1405 | [0.14317728070866245, 0.0019387900438625815, 0.0272595934506007, 0.0, 0.0, 0.01595177221476263, 0.0019825804723817746, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9389535873599618, 0.001952101801962638, 0.030160688746179085, 0.0, 0.0, 0.03231967503307433, 0.0020629190304280558, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.4726 | 40.0 | 800 | 3.9250 | 0.0037 | 0.0216 | 0.1400 | [0.14244053050451713, 0.0022596513184196345, 0.012475440730295781, 0.0, 0.0, 0.013494958163484231, 0.000710175687780943, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9559227477554171, 0.002282817699446563, 0.013087860007291287, 0.0, 0.0, 0.02344084819348948, 0.0007320035269260843, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.0128 | 41.0 | 820 | 4.4055 | 0.0033 | 0.0215 | 0.1403 | [0.14267630712423943, 0.0006876557059397718, 0.002249988165285022, 0.0, 0.0, 0.008327223523661356, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9745434217545771, 0.0006906986885504985, 0.0022659076250035053, 0.0, 0.0, 0.013015447109024168, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.1359 | 42.0 | 840 | 4.3485 | 0.0032 | 0.0215 | 0.1409 | [0.1428689296917672, 0.00043785776628320057, 0.000735403087574484, 0.0, 0.0, 0.004521202653144983, 4.1517204729639965e-06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9827243731626202, 0.00043900340373972366, 0.0007375417145741608, 0.0, 0.0, 0.00636331451357445, 4.159110948443661e-06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.3446 | 43.0 | 860 | 4.1294 | 0.0033 | 0.0216 | 0.1415 | [0.14297466923456498, 0.0010833321199237007, 0.0038991084994498782, 0.0, 0.0, 0.0033002377289889527, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9833543319599096, 0.0010887284412745147, 0.003965338343756134, 0.0, 0.0, 0.004397488214358917, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.7443 | 44.0 | 880 | 4.2309 | 0.0034 | 0.0215 | 0.1398 | [0.14257015931769873, 0.0010936527419852125, 0.004274154927043165, 0.0, 0.0, 0.010032222702056273, 4.527177469472419e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9670633704498246, 0.001100435198707574, 0.004329902689363134, 0.0, 0.0, 0.016881882721225334, 4.575022043288027e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 2.8209 | 45.0 | 900 | 4.4797 | 0.0033 | 0.0215 | 0.1407 | [0.14293897407373585, 0.001303224682901344, 0.002799500704740247, 0.0, 0.0, 0.006579395289821434, 2.0683977777134275e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9771739253810967, 0.0013140835218609062, 0.0028239715078942204, 0.0, 0.0, 0.00990366519462612, 2.0795554742218304e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 2.9059 | 46.0 | 920 | 4.5053 | 0.0040 | 0.0220 | 0.1364 | [0.14228787838186308, 0.0013217808522283945, 0.007278362518756201, 0.0, 0.0, 0.030787987593700387, 6.171544243800683e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9191155605498235, 0.0013287169686522302, 0.007467960402703385, 0.0, 0.0, 0.0856392196321762, 6.238666422665492e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.2262 | 47.0 | 940 | 5.3410 | 0.0039 | 0.0219 | 0.1365 | [0.1423258178089665, 0.003478698692055064, 0.0075020785926833535, 0.0, 0.0, 0.02704553235193427, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9217006617405024, 0.0035588542596500265, 0.007692307692307693, 0.0, 0.0, 0.07286600704343452, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 2.7618 | 48.0 | 960 | 5.0946 | 0.0038 | 0.0218 | 0.1373 | [0.1426949704126623, 0.0032438673758399044, 0.0051523797709134515, 0.0, 0.0, 0.023948251838033993, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.93417498098773, 0.0033217924216305756, 0.005241313553380633, 0.0, 0.0, 0.05908658952428867, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.7284 | 49.0 | 980 | 4.0830 | 0.0043 | 0.0221 | 0.1373 | [0.14283896315255537, 0.0034238556212341873, 0.024018439121225456, 0.0, 0.0, 0.029172219275206687, 0.0002549698559831555, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9045640798628847, 0.0035091005405595245, 0.026329958776185537, 0.0, 0.0, 0.08402742840106583, 0.00025786487880350697, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.6351 | 50.0 | 1000 | 4.5716 | 0.0039 | 0.0219 | 0.1398 | [0.1424604255351693, 0.0028172808510882213, 0.009342676914231785, 0.0, 0.0, 0.02331811292704824, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9514506078251098, 0.0028769356391743226, 0.00966095515858549, 0.0, 0.0, 0.045009037210949, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
### Framework versions
- Transformers 4.31.0
- Pytorch 2.1.0.dev20230812
- Datasets 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e3_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T16:50:09Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:50:05Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
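Since the card itself records only the PEFT version, here is a hedged sketch of what a typical PEFT prompt-tuning setup for a GPT-2 base might look like; every value below is an assumption (including `num_virtual_tokens`), none of it is recorded in the card:
```python
# Illustrative sketch; all values are assumptions, none come from the card.
from peft import PromptTuningConfig, TaskType, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("gpt2")
config = PromptTuningConfig(
    task_type=TaskType.CAUSAL_LM,
    num_virtual_tokens=20,  # assumed; not recorded in the card
)
model = get_peft_model(base, config)
model.print_trainable_parameters()  # only the soft prompt is trainable
```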
|
KingKazma/xsum_gpt2_lora_500_10_3000_8_e3_s55555_v4_l4_r4
|
KingKazma
| 2023-08-13T16:48:01Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:48:00Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
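The card records only the PEFT version, so the following is an illustrative sketch rather than the author's configuration; the rank `r=4` is guessed from the `_r4` suffix in the model id, and `lora_alpha`/`lora_dropout` are assumed values:
```python
# Illustrative only: nothing below is recorded in the card.
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("gpt2")
config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=4,               # assumed from the `_r4` model-id suffix
    lora_alpha=32,     # assumed
    lora_dropout=0.1,  # assumed
)
model = get_peft_model(base, config)
model.print_trainable_parameters()  # shows how few weights LoRA trains
```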
|
bigmorning/whisper_charsplit_new_round2__0002
|
bigmorning
| 2023-08-13T16:47:41Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T16:47:33Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0002
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0002
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0013
- Train Accuracy: 0.0795
- Train Wermet: 8.9468
- Validation Loss: 0.5652
- Validation Accuracy: 0.0766
- Validation Wermet: 8.3360
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
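As a rough illustration (not part of the original card), the optimizer dictionary above maps onto the TF optimizer shipped with `transformers`; the values below mirror the logged config, and the commented `compile` call is an assumption about how it would be wired in:
```python
# Minimal sketch reconstructing the logged optimizer config with the
# TF utilities in `transformers`; values are copied from the card above.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,
)
# model.compile(optimizer=optimizer)  # assumed usage; the card omits this step
```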
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_lora_500_10_3000_8_e1_s55555_v4_l4_r2
|
KingKazma
| 2023-08-13T16:47:35Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:47:33Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e2_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T16:41:32Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:41:28Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_lora_500_10_3000_8_e2_s55555_v4_l4_r4
|
KingKazma
| 2023-08-13T16:41:09Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:41:07Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/cnn_dailymail_gpt2_lora_500_10_3000_8_e0_s55555_v4_l4_r2
|
KingKazma
| 2023-08-13T16:40:15Z | 1 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:40:14Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_lora_500_10_3000_8_e1_s55555_v4_l4_r4
|
KingKazma
| 2023-08-13T16:34:17Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:34:15Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e1_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T16:32:55Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:32:49Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
digicazter/wav2vec2-base-timit-demo-google-colab
|
digicazter
| 2023-08-13T16:32:43Z | 105 | 0 |
transformers
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"base_model:facebook/wav2vec2-base",
"base_model:finetune:facebook/wav2vec2-base",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-07T03:20:25Z |
---
license: apache-2.0
base_model: facebook/wav2vec2-base
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-base-timit-demo-google-colab
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-base-timit-demo-google-colab
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 3.0392
- Wer: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 400
- num_epochs: 150
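For readers who want to reproduce this setup, a hedged sketch of the equivalent `transformers.TrainingArguments` follows; `output_dir` is a placeholder, and any argument not listed above is left at its library default:
```python
# Hypothetical mapping of the listed hyperparameters onto TrainingArguments;
# `output_dir` is a placeholder, all unlisted arguments keep their defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-base-timit-demo",  # placeholder name
    learning_rate=1e-3,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",   # Adam with the listed betas/epsilon is the default
    warmup_steps=400,
    num_train_epochs=150,
)
```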
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|
| 5.2993 | 8.0 | 200 | 3.0327 | 1.0 |
| 3.0806 | 16.0 | 400 | 3.0476 | 1.0 |
| 3.0219 | 24.0 | 600 | 3.0472 | 1.0 |
| 3.0179 | 32.0 | 800 | 3.0435 | 1.0 |
| 3.0157 | 40.0 | 1000 | 3.0546 | 1.0 |
| 3.0146 | 48.0 | 1200 | 3.0484 | 1.0 |
| 3.0139 | 56.0 | 1400 | 3.0344 | 1.0 |
| 3.0118 | 64.0 | 1600 | 3.0351 | 1.0 |
| 3.0114 | 72.0 | 1800 | 3.0559 | 1.0 |
| 3.0114 | 80.0 | 2000 | 3.0526 | 1.0 |
| 3.0108 | 88.0 | 2200 | 3.0417 | 1.0 |
| 3.0092 | 96.0 | 2400 | 3.0629 | 1.0 |
| 3.0089 | 104.0 | 2600 | 3.0352 | 1.0 |
| 3.0083 | 112.0 | 2800 | 3.0503 | 1.0 |
| 3.0078 | 120.0 | 3000 | 3.0529 | 1.0 |
| 3.0072 | 128.0 | 3200 | 3.0378 | 1.0 |
| 3.0068 | 136.0 | 3400 | 3.0481 | 1.0 |
| 3.0063 | 144.0 | 3600 | 3.0392 | 1.0 |
### Framework versions
- Transformers 4.31.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.3
- Tokenizers 0.13.3
|
KingKazma/xsum_gpt2_p_tuning_500_10_3000_8_e6_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T16:29:52Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:29:51Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_lora_500_10_3000_8_e0_s55555_v4_l4_r4
|
KingKazma
| 2023-08-13T16:27:25Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:27:23Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_0099
|
bigmorning
| 2023-08-13T16:27:10Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T16:27:03Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_0099
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_0099
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0022
- Train Accuracy: 0.0795
- Train Wermet: 8.7907
- Validation Loss: 0.5556
- Validation Accuracy: 0.0766
- Validation Wermet: 7.7851
- Epoch: 98
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.8733 | 0.0602 | 13.0686 | 0.6470 | 0.0676 | 11.4066 | 0 |
| 0.5740 | 0.0666 | 12.7778 | 0.5113 | 0.0706 | 11.1022 | 1 |
| 0.4553 | 0.0692 | 12.2404 | 0.4371 | 0.0723 | 10.9105 | 2 |
| 0.3813 | 0.0708 | 11.9157 | 0.3935 | 0.0733 | 9.4615 | 3 |
| 0.3292 | 0.0720 | 11.5732 | 0.3630 | 0.0740 | 9.9885 | 4 |
| 0.2886 | 0.0729 | 11.5171 | 0.3403 | 0.0745 | 9.8042 | 5 |
| 0.2561 | 0.0736 | 11.3173 | 0.3256 | 0.0749 | 9.9431 | 6 |
| 0.2282 | 0.0743 | 11.7308 | 0.3159 | 0.0752 | 9.2086 | 7 |
| 0.2036 | 0.0748 | 11.4503 | 0.3071 | 0.0754 | 9.5236 | 8 |
| 0.1820 | 0.0754 | 11.7175 | 0.3005 | 0.0756 | 10.0755 | 9 |
| 0.1628 | 0.0758 | 11.7056 | 0.2993 | 0.0757 | 9.9497 | 10 |
| 0.1450 | 0.0762 | 11.7637 | 0.2971 | 0.0758 | 10.1481 | 11 |
| 0.1287 | 0.0766 | 11.8509 | 0.3029 | 0.0759 | 10.2042 | 12 |
| 0.1140 | 0.0770 | 12.1100 | 0.3004 | 0.0760 | 10.3873 | 13 |
| 0.0998 | 0.0773 | 11.9502 | 0.3025 | 0.0761 | 10.7066 | 14 |
| 0.0872 | 0.0777 | 12.3196 | 0.3129 | 0.0759 | 10.7707 | 15 |
| 0.0760 | 0.0779 | 12.2637 | 0.3142 | 0.0761 | 10.2638 | 16 |
| 0.0651 | 0.0782 | 12.1215 | 0.3192 | 0.0761 | 10.0750 | 17 |
| 0.0547 | 0.0785 | 12.0551 | 0.3294 | 0.0761 | 10.4732 | 18 |
| 0.0463 | 0.0787 | 11.9677 | 0.3402 | 0.0760 | 10.2814 | 19 |
| 0.0386 | 0.0789 | 11.6855 | 0.3517 | 0.0760 | 10.0599 | 20 |
| 0.0318 | 0.0790 | 11.6314 | 0.3628 | 0.0760 | 9.6652 | 21 |
| 0.0262 | 0.0792 | 11.4603 | 0.3728 | 0.0760 | 10.0035 | 22 |
| 0.0224 | 0.0792 | 11.4330 | 0.3824 | 0.0760 | 9.1995 | 23 |
| 0.0181 | 0.0793 | 11.3124 | 0.3982 | 0.0759 | 9.8710 | 24 |
| 0.0142 | 0.0794 | 11.3562 | 0.4057 | 0.0760 | 9.6831 | 25 |
| 0.0118 | 0.0794 | 11.0532 | 0.4207 | 0.0759 | 9.7227 | 26 |
| 0.0101 | 0.0794 | 11.2963 | 0.4282 | 0.0760 | 9.5792 | 27 |
| 0.0114 | 0.0794 | 11.3093 | 0.4431 | 0.0758 | 9.5545 | 28 |
| 0.0109 | 0.0794 | 11.4214 | 0.4419 | 0.0760 | 9.4377 | 29 |
| 0.0084 | 0.0794 | 10.9143 | 0.4474 | 0.0760 | 9.3668 | 30 |
| 0.0043 | 0.0795 | 10.9497 | 0.4525 | 0.0761 | 9.3202 | 31 |
| 0.0036 | 0.0795 | 10.7759 | 0.4667 | 0.0761 | 9.0385 | 32 |
| 0.0047 | 0.0795 | 10.7613 | 0.4788 | 0.0759 | 9.4065 | 33 |
| 0.0130 | 0.0793 | 11.1022 | 0.4748 | 0.0760 | 9.4521 | 34 |
| 0.0074 | 0.0794 | 10.9738 | 0.4730 | 0.0760 | 9.3348 | 35 |
| 0.0032 | 0.0795 | 10.6370 | 0.4750 | 0.0762 | 8.8298 | 36 |
| 0.0020 | 0.0795 | 10.7428 | 0.4835 | 0.0762 | 9.0566 | 37 |
| 0.0014 | 0.0795 | 10.6908 | 0.4937 | 0.0761 | 9.2445 | 38 |
| 0.0035 | 0.0795 | 10.6833 | 0.5276 | 0.0757 | 8.9798 | 39 |
| 0.0120 | 0.0793 | 10.4810 | 0.4963 | 0.0760 | 8.9194 | 40 |
| 0.0045 | 0.0795 | 10.2251 | 0.5014 | 0.0761 | 8.5737 | 41 |
| 0.0028 | 0.0795 | 10.3174 | 0.4968 | 0.0762 | 8.8525 | 42 |
| 0.0023 | 0.0795 | 10.4871 | 0.5027 | 0.0762 | 8.6712 | 43 |
| 0.0024 | 0.0795 | 10.3731 | 0.5055 | 0.0762 | 8.6347 | 44 |
| 0.0041 | 0.0795 | 10.2751 | 0.5242 | 0.0760 | 8.3671 | 45 |
| 0.0070 | 0.0794 | 10.2166 | 0.5169 | 0.0760 | 8.8409 | 46 |
| 0.0037 | 0.0795 | 10.0455 | 0.5174 | 0.0762 | 8.2514 | 47 |
| 0.0023 | 0.0795 | 9.9201 | 0.5167 | 0.0763 | 8.9537 | 48 |
| 0.0008 | 0.0795 | 10.0022 | 0.5166 | 0.0764 | 8.4855 | 49 |
| 0.0006 | 0.0795 | 9.9494 | 0.5233 | 0.0763 | 8.5719 | 50 |
| 0.0069 | 0.0794 | 10.2037 | 0.5434 | 0.0759 | 8.5399 | 51 |
| 0.0083 | 0.0794 | 9.9557 | 0.5173 | 0.0762 | 8.2406 | 52 |
| 0.0032 | 0.0795 | 10.0283 | 0.5240 | 0.0763 | 9.0101 | 53 |
| 0.0018 | 0.0795 | 10.0694 | 0.5247 | 0.0763 | 8.5717 | 54 |
| 0.0008 | 0.0795 | 10.1079 | 0.5217 | 0.0764 | 8.5608 | 55 |
| 0.0005 | 0.0795 | 10.0546 | 0.5286 | 0.0764 | 8.8830 | 56 |
| 0.0007 | 0.0795 | 10.2557 | 0.5328 | 0.0764 | 8.5665 | 57 |
| 0.0006 | 0.0795 | 10.2165 | 0.5412 | 0.0763 | 8.4623 | 58 |
| 0.0124 | 0.0792 | 10.2304 | 0.5284 | 0.0762 | 9.1194 | 59 |
| 0.0044 | 0.0795 | 10.3884 | 0.5223 | 0.0764 | 8.8152 | 60 |
| 0.0015 | 0.0795 | 9.8557 | 0.5227 | 0.0764 | 8.3774 | 61 |
| 0.0005 | 0.0795 | 9.8123 | 0.5233 | 0.0765 | 8.5043 | 62 |
| 0.0003 | 0.0795 | 9.7631 | 0.5282 | 0.0765 | 8.3860 | 63 |
| 0.0003 | 0.0795 | 9.7593 | 0.5320 | 0.0765 | 8.4815 | 64 |
| 0.0002 | 0.0795 | 9.7663 | 0.5357 | 0.0765 | 8.4281 | 65 |
| 0.0034 | 0.0795 | 9.8382 | 0.5771 | 0.0758 | 8.8051 | 66 |
| 0.0123 | 0.0792 | 10.2575 | 0.5261 | 0.0763 | 9.3701 | 67 |
| 0.0027 | 0.0795 | 10.3802 | 0.5272 | 0.0764 | 8.8216 | 68 |
| 0.0011 | 0.0795 | 10.1683 | 0.5291 | 0.0764 | 8.5736 | 69 |
| 0.0012 | 0.0795 | 10.1305 | 0.5336 | 0.0765 | 8.6648 | 70 |
| 0.0008 | 0.0795 | 10.2545 | 0.5315 | 0.0765 | 9.0617 | 71 |
| 0.0006 | 0.0795 | 10.4562 | 0.5369 | 0.0765 | 9.6485 | 72 |
| 0.0032 | 0.0795 | 10.2347 | 0.5569 | 0.0763 | 8.4947 | 73 |
| 0.0062 | 0.0794 | 10.1654 | 0.5471 | 0.0763 | 8.8666 | 74 |
| 0.0029 | 0.0795 | 10.1320 | 0.5376 | 0.0765 | 8.7713 | 75 |
| 0.0012 | 0.0795 | 10.2943 | 0.5406 | 0.0765 | 8.6959 | 76 |
| 0.0006 | 0.0795 | 10.1888 | 0.5371 | 0.0767 | 8.9689 | 77 |
| 0.0005 | 0.0795 | 10.2138 | 0.5398 | 0.0766 | 8.7470 | 78 |
| 0.0016 | 0.0795 | 10.2173 | 0.5497 | 0.0764 | 8.9675 | 79 |
| 0.0065 | 0.0794 | 10.2806 | 0.5559 | 0.0763 | 9.4487 | 80 |
| 0.0028 | 0.0795 | 10.7728 | 0.5394 | 0.0766 | 8.9716 | 81 |
| 0.0012 | 0.0795 | 10.3247 | 0.5453 | 0.0765 | 8.9986 | 82 |
| 0.0013 | 0.0795 | 10.3174 | 0.5535 | 0.0765 | 8.9229 | 83 |
| 0.0011 | 0.0795 | 10.2846 | 0.5452 | 0.0766 | 9.1239 | 84 |
| 0.0007 | 0.0795 | 10.1996 | 0.5491 | 0.0766 | 8.9308 | 85 |
| 0.0034 | 0.0795 | 10.5048 | 0.5578 | 0.0764 | 8.9920 | 86 |
| 0.0038 | 0.0795 | 10.1430 | 0.5538 | 0.0765 | 9.1635 | 87 |
| 0.0019 | 0.0795 | 10.3176 | 0.5492 | 0.0766 | 8.5812 | 88 |
| 0.0007 | 0.0795 | 10.2569 | 0.5488 | 0.0766 | 8.9133 | 89 |
| 0.0006 | 0.0795 | 10.2538 | 0.5541 | 0.0766 | 8.7676 | 90 |
| 0.0029 | 0.0795 | 10.1412 | 0.5666 | 0.0764 | 9.0822 | 91 |
| 0.0042 | 0.0795 | 9.5603 | 0.5582 | 0.0765 | 7.6837 | 92 |
| 0.0015 | 0.0795 | 9.4004 | 0.5495 | 0.0766 | 7.7859 | 93 |
| 0.0008 | 0.0795 | 9.5417 | 0.5503 | 0.0767 | 7.8876 | 94 |
| 0.0005 | 0.0795 | 9.3473 | 0.5590 | 0.0766 | 7.8967 | 95 |
| 0.0016 | 0.0795 | 9.1740 | 0.5746 | 0.0765 | 7.8469 | 96 |
| 0.0044 | 0.0794 | 8.8948 | 0.5589 | 0.0765 | 7.4085 | 97 |
| 0.0022 | 0.0795 | 8.7907 | 0.5556 | 0.0766 | 7.7851 | 98 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e0_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T16:24:15Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:24:12Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|