
CRM / imagedream/ldm/modules/attention.py

Commit History

Refactor attention module to improve xformers integration. Renamed the availability flag to HAS_XFORMERS and added a safe_memory_efficient_attention function for better handling of attention operations across devices (see the sketch after this entry). Updated related assertions and calls to ensure compatibility with systems lacking GPU support.
1d3fed2

YoussefAnso committed on
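The commit above describes guarding the xformers fast path behind an availability flag and falling back gracefully on machines without GPU support. Below is a minimal sketch of how such a safe_memory_efficient_attention wrapper could be structured; the fallback to PyTorch's scaled_dot_product_attention, the signature, and the (batch, seq_len, n_heads, head_dim) tensor layout are assumptions, not the actual code in attention.py.

```python
import torch
import torch.nn.functional as F

# Availability flag named in the commit; set at import time.
try:
    import xformers.ops
    HAS_XFORMERS = True
except ImportError:
    HAS_XFORMERS = False


def safe_memory_efficient_attention(q, k, v, attn_bias=None):
    """Hypothetical wrapper: use xformers' memory-efficient attention when it
    is installed and the tensors live on a CUDA device; otherwise fall back to
    PyTorch's built-in scaled dot-product attention so CPU-only systems work.

    Assumes xformers' (batch, seq_len, n_heads, head_dim) layout; the real
    function in attention.py may differ.
    """
    if HAS_XFORMERS and q.is_cuda:
        return xformers.ops.memory_efficient_attention(q, k, v, attn_bias=attn_bias)
    # Fallback path: F.scaled_dot_product_attention expects
    # (batch, n_heads, seq_len, head_dim), so transpose in and back out.
    out = F.scaled_dot_product_attention(
        q.transpose(1, 2),
        k.transpose(1, 2),
        v.transpose(1, 2),
        attn_mask=attn_bias,
    )
    return out.transpose(1, 2)
```

Centralizing the device check in one wrapper means callers no longer need their own assertions that xformers is present, which matches the commit's note about updating related assertions and calls.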

init
f4e8cf6

Zhengyi committed on