transformers
yt-dlp
openai
torch
torchaudio
scipy

gradio
python-dotenv
pdfplumber
websockets
streamlit
moviepy
docling
# NOTE: this installs the PyPI "ffmpeg" package, not the ffmpeg command-line tool;
# the binary itself typically has to be provided at the system level.
ffmpeg

torchvision
accelerate
spaces
https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.7-cp310-cp310-linux_x86_64.whl
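# Install sketch (assumed usage; project-specific setup is not described in this file):
#   pip install -r requirements.txt
# The flash_attn wheel above is prebuilt for Python 3.10, torch 2.7 and CUDA 12.6 on
# linux_x86_64 (per the tags in its filename); on a different Python/torch/CUDA
# combination, a matching prebuilt wheel, if available, or a source build of
# flash-attn would be needed instead.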