---
title: Tranception iGEM BASIS-China 2025
emoji: 🧬
colorFrom: blue
colorTo: green
sdk: gradio
sdk_version: 5.34.2
app_file: app.py
pinned: false
license: mit
suggested_hardware: zero-a10g
models:
  - PascalNotin/Tranception_Small
  - PascalNotin/Tranception_Medium
  - PascalNotin/Tranception_Large
---

# Tranception Protein Fitness Prediction - BASIS-China iGEM 2025

Welcome to BASIS-China iGEM Team's deployment of Tranception on Hugging Face Spaces!

## About This Project

This is an implementation of the Tranception model for protein fitness prediction, deployed by the BASIS-China iGEM Team 2025. Our goal is to make advanced protein engineering tools accessible to the synthetic biology community.

### Features

- **In silico directed evolution**: Iteratively improve protein fitness through rounds of single amino acid substitutions (see the scoring sketch after this list)
- **Comprehensive fitness analysis**: Generate heatmaps of fitness scores for every possible single amino acid substitution
- **Zero GPU support**: Leverages Hugging Face's dynamic GPU allocation for efficient inference
- **Multiple model sizes**: Choose between the Small, Medium, and Large checkpoints depending on your speed and accuracy needs
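
For orientation, here is a minimal sketch of how one of the hosted checkpoints might be used to score single substitutions. It assumes the `tranception` package from the original OATML-Markslab/Tranception repository is installed and exposes the `score_mutants` helper; tokenizer setup is omitted, argument names can differ between releases, and the wild-type sequence and mutants below are placeholders rather than the exact code in app.py.

```python
import pandas as pd
# Assumes the `tranception` package from OATML-Markslab/Tranception is installed;
# loading and tokenizer details follow that repository and may need minor adjustment.
from tranception import model_pytorch

# Load one of the hosted checkpoints directly from the Hub (Small, Medium, or Large)
model = model_pytorch.TranceptionLMHeadModel.from_pretrained("PascalNotin/Tranception_Small")

# Placeholder wild-type sequence and a few single amino acid substitutions
target_seq = "MSKGEELFTGVVPILVELDGDVNGHKFSVSGEGEGDAT"
mutants = ["M1A", "K3R", "E5Q"]
mutated = [target_seq[:int(m[1:-1]) - 1] + m[-1] + target_seq[int(m[1:-1]):] for m in mutants]
dms = pd.DataFrame({"mutant": mutants, "mutated_sequence": mutated})

# score_mutants returns log-likelihood-based fitness scores, one row per mutant
scores = model.score_mutants(
    DMS_data=dms,
    target_seq=target_seq,
    scoring_mirror=True,      # average over forward and reversed reads of the sequence
    batch_size_inference=10,
)
print(scores)
```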

### Technical Implementation

This deployment runs on Hugging Face's Zero GPU infrastructure (see the sketch below), which:

- Dynamically allocates H200 GPU resources when available
- Falls back to CPU processing when no GPU can be allocated
- Shares GPU time efficiently across all users of the Space
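
In a Gradio Space, Zero GPU allocation is requested per function call with the `spaces.GPU` decorator. The snippet below is a minimal sketch of that pattern; the `score` function and its body are placeholders, not the actual contents of app.py.

```python
import spaces              # Zero GPU helper, preinstalled on Hugging Face Spaces
import torch
import gradio as gr

@spaces.GPU(duration=120)  # a GPU is attached only while this call runs, then released
def score(sequence: str) -> str:
    # Hypothetical scoring stub: the real app runs Tranception here.
    device = "cuda" if torch.cuda.is_available() else "cpu"  # CPU fallback if no GPU was granted
    return f"Scored {len(sequence)} residues on {device}"

demo = gr.Interface(fn=score, inputs="text", outputs="text")
demo.launch()
```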

## About BASIS-China iGEM Team

We are a high school synthetic biology team participating in the International Genetically Engineered Machine (iGEM) competition. Our 2025 project focuses on protein engineering and computational biology applications.

## Credits

This implementation is based on:

**Tranception: Protein Fitness Prediction with Autoregressive Transformers and Inference-time Retrieval**
by Pascal Notin, Mafalda Dias, Jonathan Frazer, Javier Marchena-Hurtado, Aidan N. Gomez, Debora S. Marks, and Yarin Gal.

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference | |