yichenchenchen committed on
Commit 3562df8 · verified · 1 parent: d17789b

Update requirements.txt

Files changed (1):
  1. requirements.txt +1 -1
requirements.txt CHANGED
@@ -6,4 +6,4 @@ transformers==4.55.0
 timm==0.9.12
 diffusers==0.34.0
 autoawq==0.2.9
-./flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/tag/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu121torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
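The added line replaces a relative local wheel path with a PEP 508 direct URL reference (`name @ url`), which pip accepts inside a requirements file and resolves by downloading that exact wheel instead of querying an index. A minimal sketch of checking that such a line parses as a valid requirement, using the `packaging` library (the shortened example URL below is an illustration, not the real release asset):

```python
from packaging.requirements import Requirement  # third-party "packaging" library

# A PEP 508 direct URL reference, shaped like the line added in this commit.
# The URL here is a placeholder for illustration only.
line = ("flash-attn @ https://example.com/"
        "flash_attn-2.7.4.post1+cu121-cp310-cp310-linux_x86_64.whl")

# Requirement() raises InvalidRequirement if the syntax is malformed.
req = Requirement(line)
print(req.name)             # -> flash-attn
print(req.url is not None)  # -> True: a direct URL reference, not an index lookup
```

A direct URL pin like this makes the environment reproducible across machines, whereas the old `./flash_attn-...whl` path only worked when the wheel file happened to sit next to `requirements.txt`.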