Merge pull request #30 from LightricksResearch/fix-no-flash-attention 05cb3e4 Sapir Weissbuch committed 28 days ago
model: fix flash attention enabling - do not check device type at this point (module can still be on CPU) 5940103 erichardson committed 28 days ago
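
The commit message suggests the bug was gating flash attention on the module's device at construction time, when models are typically built on CPU and moved to CUDA afterwards. Below is a minimal hedged sketch of that pattern and its fix; the class and attribute names (`Attention`, `use_flash_attention`) are illustrative, not the repo's actual API:

```python
import torch
import torch.nn.functional as F

class Attention(torch.nn.Module):
    def __init__(self, use_flash_attention: bool = True):
        super().__init__()
        # Buggy pattern the commit describes: deciding here based on the
        # current device permanently disables flash attention, because the
        # module is usually constructed on CPU and only later moved to CUDA.
        # Record the intent only; check the device at call time instead.
        self.use_flash_attention = use_flash_attention

    def forward(self, q, k, v):
        # The tensors' actual placement is known here, so this is where a
        # device-dependent fast path belongs.
        if self.use_flash_attention and q.is_cuda:
            # Flash kernels are CUDA-only; SDPA selects one when eligible.
            return F.scaled_dot_product_attention(q, k, v)
        # CPU path (or flash disabled): same op via the fallback backend.
        return F.scaled_dot_product_attention(q, k, v)
```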
Feature: Add mixed precision support and direct bfloat16 support. 1940326 daniel shalem committed on Oct 31
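
The commit title names two distinct modes; a minimal sketch of what each typically looks like in PyTorch is below (the `Linear` model is a stand-in, not the repo's transformer):

```python
import torch

# Direct bfloat16: cast the weights themselves once; activations follow.
model_bf16 = torch.nn.Linear(64, 64).to(dtype=torch.bfloat16)
x = torch.randn(2, 64, dtype=torch.bfloat16)
y = model_bf16(x)

# Mixed precision: keep fp32 master weights, run compute in bfloat16
# inside an autocast region.
model_fp32 = torch.nn.Linear(64, 64)
x32 = torch.randn(2, 64)
with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    y_mixed = model_fp32(x32)
```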
transformer3d: init mode "xora" was never matched because the comparison requires lowercase. a3498bb dudumoshe committed on Oct 8
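
The message points at a case-sensitivity bug: an equality check against the lowercase string "xora" never fires when the configured value arrives with different casing (e.g. "Xora"). A hedged sketch of the bug and the lowercase normalization it implies; the enum and function names here are hypothetical:

```python
from enum import Enum

class InitMode(Enum):
    XORA = "xora"

def resolve_init_mode(init_mode: str) -> InitMode:
    # Buggy: InitMode(init_mode) raises ValueError for "Xora", so the
    # xora branch never runs. Fixed: normalize casing before matching.
    return InitMode(init_mode.lower())

assert resolve_init_mode("Xora") is InitMode.XORA
```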