The current version of the JupyterLab image causes issues with NVCC: when you try to install flash-attention, the build never completes. With this change you can use CUDA to its fullest.
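For context, the change is along these lines; this is only a sketch, and the base-image tag and install steps below are illustrative assumptions, not the Space's actual Dockerfile:

```dockerfile
# Hypothetical sketch: pin the base image to a CUDA toolkit release
# that flash-attention can build against (tags here are illustrative)
FROM nvidia/cuda:12.5.1-devel-ubuntu22.04

# JupyterLab on top of the CUDA toolkit image
RUN apt-get update && apt-get install -y python3 python3-pip \
    && pip3 install jupyterlab

# flash-attn compiles CUDA kernels with nvcc at install time, so the
# toolkit in the image must be a version the package supports
RUN pip3 install torch && pip3 install flash-attn --no-build-isolation
```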

This seems useful!

@denizaybey I found that a few libraries don't yet claim to support 12.6, so I've merged a PR to bump to 12.5 for now: https://huggingface.co/spaces/SpacesExamples/jupyterlab/discussions/19#66f3c95d9aa075976ab89470. Is that sufficient for what you need? I had no issues using flash attention with this version.
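If it helps anyone checking a similar setup, something along these lines should confirm the toolkit and the build (a rough sketch; output and versions will vary):

```bash
# Confirm which CUDA toolkit nvcc reports inside the Space
nvcc --version

# Build flash-attention against it; --no-build-isolation lets the
# build use the already-installed torch (per flash-attn's install docs)
pip install flash-attn --no-build-isolation

# Import check: fails if the extension didn't compile correctly
python -c "import flash_attn; print(flash_attn.__version__)"
```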

@davanstrien It is fully sufficient, thank you very much!

denizaybey changed pull request status to closed
