---
title: Tranception iGEM BASIS-China 2025
emoji: 🧬
colorFrom: blue
colorTo: green
sdk: gradio
sdk_version: 5.34.2
app_file: app.py
pinned: false
license: mit
suggested_hardware: zero-a10g
models:
  - PascalNotin/Tranception_Small
  - PascalNotin/Tranception_Medium
  - PascalNotin/Tranception_Large
---

# Tranception Protein Fitness Prediction - BASIS-China iGEM 2025

Welcome to BASIS-China iGEM Team's deployment of Tranception on Hugging Face Spaces!

## About This Project

This Space hosts the Tranception model for protein fitness prediction, deployed by the 2025 BASIS-China iGEM Team. Our goal is to make advanced protein engineering tools accessible to the synthetic biology community.

### Features
- **In silico directed evolution**: Iteratively improve protein fitness through single amino acid substitutions
- **Comprehensive fitness analysis**: Generate heatmaps showing fitness scores for all possible mutations
- **Zero GPU support**: Leverages Hugging Face's dynamic GPU allocation for efficient inference
- **Multiple model sizes**: Choose between Small, Medium, and Large models based on your needs
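To make the first two features concrete, here is a minimal sketch of how a fitness heatmap and a round of in silico directed evolution can be framed. The enumeration of single substitutions follows standard mutation notation; `score_fn` is a hypothetical stand-in for a Tranception scoring call, not this Space's actual API.

```python
# All 20 standard amino acids, one-letter codes.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def single_substitutions(seq):
    """Yield (mutation_label, mutant_sequence) for every single
    amino-acid substitution -- the grid a fitness heatmap covers."""
    for i, wt in enumerate(seq):
        for aa in AMINO_ACIDS:
            if aa != wt:
                # e.g. "M1A" = wild-type M at position 1 mutated to A
                yield f"{wt}{i + 1}{aa}", seq[:i] + aa + seq[i + 1:]

def greedy_step(seq, score_fn):
    """One round of in silico directed evolution: keep the
    best-scoring single mutant under score_fn."""
    return max(single_substitutions(seq), key=lambda m: score_fn(m[1]))

# A 3-residue toy sequence yields 3 positions x 19 alternatives = 57 mutants.
muts = list(single_substitutions("MKV"))
```

Iterating `greedy_step` with a real fitness model reproduces the evolution loop described above: each round commits to the single substitution the model scores highest.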

### Technical Implementation
This deployment utilizes Hugging Face's Zero GPU infrastructure, which:
- Dynamically allocates H200 GPU resources when available
- Seamlessly falls back to CPU processing when GPUs are unavailable
- Ensures efficient resource management for all users
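A typical Zero GPU Space wraps its inference function with the `spaces.GPU` decorator, which requests a GPU for the duration of the call. The sketch below shows that pattern with a CPU fallback when the `spaces` package is absent (i.e. off-platform); the `predict` body is a hypothetical placeholder, not the Space's actual scoring code.

```python
# Hedged sketch of the Zero GPU wrapping pattern, assuming the `spaces`
# package provided on Hugging Face Spaces; the try/except keeps this
# snippet runnable on machines where that package is not installed.
try:
    import spaces
    gpu_wrap = spaces.GPU        # allocates a GPU for the call's duration
except ImportError:
    gpu_wrap = lambda fn: fn     # off Spaces: run undecorated on CPU

@gpu_wrap
def predict(sequence: str) -> float:
    # Placeholder for the actual Tranception scoring call.
    return float(len(sequence))
```

With this structure the same `app.py` runs unmodified whether or not a GPU is granted, which is what enables the seamless CPU fallback described above.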

## About BASIS-China iGEM Team

We are a high school synthetic biology team participating in the International Genetically Engineered Machine (iGEM) competition. Our 2025 project focuses on protein engineering and computational biology applications.

## Credits

This implementation is based on:
**Tranception: Protein Fitness Prediction with Autoregressive Transformers and Inference-time Retrieval**
by Pascal Notin, Mafalda Dias, Jonathan Frazer, Javier Marchena-Hurtado, Aidan N. Gomez, Debora S. Marks, and Yarin Gal.

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference