numpy==1.24.3
torch==2.2.0
transformers==4.44.2
https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu123torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl