Phi-3-small-128k-instruct / positional_embedding.py

Commit History

Fix for `RuntimeError: FlashAttention only support fp16 and bf16 data type` during fine-tuning.
5b7216f
verified

moidhassan committed on