comfyanonymous 8248babd44 Use pytorch attention by default on nvidia when xformers isn't present.
Add a new argument --use-quad-cross-attention
2023-06-26 13:03:44 -04:00
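The commit above changes the default attention backend (PyTorch attention on NVIDIA GPUs when xformers is not installed) and adds a `--use-quad-cross-attention` flag. Below is a minimal sketch of how that kind of backend selection could be wired up; the helper names (`xformers_available`, `is_nvidia`, `pick_attention_backend`) and the extra `--use-pytorch-cross-attention` flag are assumptions for illustration, not ComfyUI's actual implementation.

```python
import argparse

# Hypothetical CLI wiring; not ComfyUI's real argument parser.
parser = argparse.ArgumentParser()
group = parser.add_mutually_exclusive_group()
group.add_argument("--use-pytorch-cross-attention", action="store_true",
                   help="Force the PyTorch scaled_dot_product_attention backend.")
group.add_argument("--use-quad-cross-attention", action="store_true",
                   help="Force the sub-quadratic ('quad') attention backend.")
args = parser.parse_args()


def xformers_available() -> bool:
    """Return True if the xformers package can be imported."""
    try:
        import xformers  # noqa: F401
        return True
    except ImportError:
        return False


def is_nvidia() -> bool:
    """Rough check for an NVIDIA CUDA device (placeholder heuristic)."""
    import torch
    return torch.cuda.is_available() and torch.version.cuda is not None


def pick_attention_backend() -> str:
    """Choose an attention implementation following the behavior the commit describes."""
    if args.use_quad_cross_attention:
        return "quad"
    if args.use_pytorch_cross_attention:
        return "pytorch"
    if xformers_available():
        return "xformers"
    # Default described by the commit: on NVIDIA without xformers, use PyTorch attention.
    if is_nvidia():
        return "pytorch"
    return "quad"


print(f"Using attention backend: {pick_attention_backend()}")
```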