# HowJanusSeesItself / requirements.txt
torch
numpy
Pillow
gradio
# Janus model/processor code installed directly from the DeepSeek GitHub repository
janus @ git+https://github.com/deepseek-ai/Janus
transformers
# Hugging Face `spaces` package (Spaces runtime helpers, e.g. ZeroGPU decorators)
spaces
# Prebuilt FlashAttention 2.6.3 wheel (CUDA 12.3, PyTorch 2.4, CPython 3.10, linux x86_64),
# pinned as a direct wheel URL so the Space installs it without compiling from source
flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu123torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
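
# For context, a minimal sketch (not part of this repo) of how a Space built on these
# pinned dependencies might load Janus with FlashAttention enabled. The checkpoint id,
# the attn_implementation argument, and the AutoModelForCausalLM/trust_remote_code flow
# follow the Janus README pattern and are assumptions here, not taken from this Space's code.
```python
# sketch_app.py -- illustrative only; assumes the Janus README-style loading API.
import torch
from transformers import AutoModelForCausalLM

# Hypothetical checkpoint; the actual Space may pin a different Janus model.
MODEL_ID = "deepseek-ai/Janus-1.3B"

# trust_remote_code pulls in the custom Janus modeling code from the Hub.
# attn_implementation="flash_attention_2" is what the pinned flash-attn wheel enables
# (assumption: the Janus modeling code honors this standard transformers argument).
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    attn_implementation="flash_attention_2",
).cuda().eval()
```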