Make flash attention configurable in user code
#26
by YenChunChen - opened
With this PR, users can specify whether to enable flash attention 2 in `from_pretrained`.
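A minimal sketch of how a caller might select the attention backend at load time, assuming this checkpoint exposes the standard `attn_implementation` keyword from `transformers` (the helper name and model id below are illustrative, not part of this PR):

```python
# Hedged sketch: build the kwargs a user would pass to from_pretrained
# to choose between flash attention 2 and the plain eager backend.
def build_load_kwargs(use_flash_attention: bool) -> dict:
    # "flash_attention_2" uses the fused FlashAttention kernels;
    # "eager" is the plain PyTorch implementation that runs anywhere.
    impl = "flash_attention_2" if use_flash_attention else "eager"
    return {"attn_implementation": impl, "trust_remote_code": True}

# Usage (not executed here; the model id is a placeholder):
# from transformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained(
#     "org/model-id", **build_load_kwargs(use_flash_attention=True)
# )
```

Passing `use_flash_attention=False` yields `attn_implementation="eager"`, matching the fallback discussed below.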
YenChunChen changed pull request status to open
@YenChunChen the default should be flash_attention in the README; users can specify eager if they want.
@haipingwu updated the default to flash attention
hi @YenChunChen, please reset config.json to the original as well
done
leoxiaobin changed pull request status to merged