# UGround/llava/train/train_mem.py
from llava.train.train import train

if __name__ == "__main__":
    # Launch LLaVA training with the FlashAttention-2 attention backend
    # (requires the flash-attn package and a compatible GPU).
    train(attn_implementation="flash_attention_2")
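
# A minimal sketch (not part of the original launcher): one way to fall back to
# PyTorch's native SDPA backend when the optional flash-attn package is not
# installed, assuming train() forwards attn_implementation to transformers'
# from_pretrained() unchanged:
#
#   import importlib.util
#   attn = "flash_attention_2" if importlib.util.find_spec("flash_attn") else "sdpa"
#   train(attn_implementation=attn)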