# incident-ml-inference / requirements.txt
# HF Space: Docker FastAPI inference
fastapi                  # web framework for the HTTP inference API
uvicorn[standard]        # ASGI server that runs the FastAPI app
transformers             # Hugging Face library for loading the model and tokenizer
torch                    # PyTorch backend used by transformers for inference
sentencepiece>=0.1.99    # required by SentencePiece-based tokenizers (e.g. T5, mBART)
tokenizers>=0.15         # fast Rust-backed tokenizers used by transformers
protobuf>=4.25           # needed by some slow/SentencePiece tokenizer conversions
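
For context, the sketch below shows a minimal FastAPI inference app that this dependency set would support. It is illustrative only: the model checkpoint, request schema, and /predict endpoint are assumptions, not the Space's actual application code.

# app.py -- minimal sketch, assuming a transformers text-classification pipeline
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()

# Hypothetical checkpoint; the real Space may load a different model.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

class IncidentRequest(BaseModel):
    text: str

@app.post("/predict")
def predict(req: IncidentRequest):
    # Run the pipeline on the incoming text and return the top label and score.
    result = classifier(req.text)[0]
    return {"label": result["label"], "score": result["score"]}

In a Docker Space this would typically be started with uvicorn, e.g. `uvicorn app:app --host 0.0.0.0 --port 7860` (7860 being the port Hugging Face Spaces expects by default).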