MohamedRashad committed
Commit cb5f0c4 · Parent(s): 0d5050b

Add accelerate and flash attention wheel to requirements

Files changed (1):
  requirements.txt +2 -0
requirements.txt CHANGED
@@ -1,3 +1,5 @@
 transformers
 torch
+accelerate
+https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.0.8/flash_attn-2.7.4.post1+cu126torch2.7-cp310-cp310-linux_x86_64.whl
 spaces
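
For context, a minimal sketch of how the two new dependencies are typically exercised in a transformers-based Space. This is not code from this repository: the model id, dtype, and prompt are placeholders. accelerate backs device_map-based loading, and the prebuilt flash-attn wheel is used when attn_implementation="flash_attention_2" is requested.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder model id; the actual model served by this Space is not shown in the diff.
model_id = "some-org/some-model"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,               # flash attention requires fp16/bf16 weights
    device_map="auto",                        # device placement handled by accelerate
    attn_implementation="flash_attention_2",  # uses the flash-attn wheel added above
)

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The pinned wheel URL matches CUDA 12.6, torch 2.7, and CPython 3.10 (cu126torch2.7-cp310), so the Space's torch build and Python version need to line up with it for the install to succeed.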