Spaces: mashroo (Running on Zero)

CRM / imagedream / ldm / modules / diffusionmodules
YoussefAnso
Refactor the attention module to improve xformers integration: rename the availability flag to HAS_XFORMERS and add a safe_memory_efficient_attention function for better handling of attention operations across devices. Update related assertions and calls to ensure compatibility with systems lacking GPU support.
1d3fed2
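
The commit message describes a device-aware wrapper around xformers' memory-efficient attention. The actual code is not shown in this listing, so the following is only a minimal sketch of what such a helper might look like; the argument names, tensor layout, and PyTorch fallback are assumptions, not the repository's implementation.

```python
# Sketch only: the real safe_memory_efficient_attention in the repo may differ.
import torch
import torch.nn.functional as F

try:
    import xformers.ops
    HAS_XFORMERS = True  # availability flag named in the commit message
except ImportError:
    HAS_XFORMERS = False


def safe_memory_efficient_attention(q, k, v, attn_bias=None):
    """Run attention with xformers when it is installed and the tensors are on
    a CUDA device; otherwise fall back to plain PyTorch attention.

    Assumes q, k, v are shaped (batch, heads, seq_len, head_dim).
    """
    if HAS_XFORMERS and q.is_cuda:
        # xformers expects (batch, seq_len, heads, head_dim).
        q_x, k_x, v_x = (t.transpose(1, 2).contiguous() for t in (q, k, v))
        out = xformers.ops.memory_efficient_attention(q_x, k_x, v_x, attn_bias=attn_bias)
        return out.transpose(1, 2)
    # CPU / no-xformers path: standard scaled dot-product attention.
    return F.scaled_dot_product_attention(q, k, v, attn_mask=attn_bias)
```

Callers that previously asserted on xformers availability can then check HAS_XFORMERS or call the wrapper directly, which is presumably what lets the module run on machines without GPU support.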