---
base_model: ProTrekHub/Protein_Encoder_35M
library_name: peft
---
# Model Card for Model-demo-35M
This model is a LoRA adapter used for a demo classification task.
## Task type
Protein-level Classification
## Model input type
SA (structure-aware) sequence
## Label meanings
- 0: A
- 1: B
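
A minimal inference sketch, assuming the base encoder loads through transformers' Auto classes (`trust_remote_code=True` may be required) and that this repository holds a standard PEFT adapter. The adapter repo id and the SA sequence below are placeholders.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from peft import PeftModel

base_id = "ProTrekHub/Protein_Encoder_35M"
adapter_id = "<this-repo-id>"  # placeholder: replace with this adapter's repo id

tokenizer = AutoTokenizer.from_pretrained(base_id, trust_remote_code=True)
base = AutoModelForSequenceClassification.from_pretrained(
    base_id, num_labels=2, trust_remote_code=True
)
model = PeftModel.from_pretrained(base, adapter_id)  # attach the LoRA adapter
model.eval()

sa_sequence = "M#E#V#..."  # placeholder structure-aware (SA) sequence
inputs = tokenizer(sa_sequence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print({0: "A", 1: "B"}[int(logits.argmax(dim=-1))])
```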
## LoRA config
- **r:** 8
- **lora_dropout:** 0.0
- **lora_alpha:** 16
- **target_modules:** ['output.dense', 'intermediate.dense', 'key', 'query', 'value']
- **modules_to_save:** ['classifier']
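
The settings above map directly onto peft's `LoraConfig`; the sketch below shows the correspondence. The task type is an assumption (sequence classification) and is not stated in this card.

```python
from peft import LoraConfig, TaskType

lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,  # assumption: protein-level classification head
    r=8,
    lora_alpha=16,
    lora_dropout=0.0,
    target_modules=["output.dense", "intermediate.dense", "key", "query", "value"],
    modules_to_save=["classifier"],  # train and save the classification head
)
```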
## Training config
- **optimizer:**
  - **class:** AdamW
  - **betas:** (0.9, 0.98)
  - **weight_decay:** 0.01
- **learning rate:** 0.001
- **epoch:** 1
- **batch size:** 64
- **precision:** 16-mixed
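
A sketch of the optimizer settings listed above, assuming `model` is the PEFT-wrapped model. The training loop and the 16-mixed precision handling are framework-specific (e.g. PyTorch Lightning or accelerate) and are omitted here.

```python
import torch

# Optimize only the trainable (LoRA + classifier) parameters.
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad),
    lr=1e-3,
    betas=(0.9, 0.98),
    weight_decay=0.01,
)
```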