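# Python dependencies; install with: pip install -r requirements.txt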
gradio
open_clip_torch
torch  # required by open_clip_torch
datasets
torchvision
Pillow
numpy
transformers  # optional: only if using transformer models directly, beyond open_clip