How can I load the model from a local path?
#37
opened by yizhiezi
- I downloaded the model with from_pretrained() and cache_dir="/mylocal":

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, GenerationConfig

tokenizer = AutoTokenizer.from_pretrained("baichuan-inc/Baichuan-13B-Chat", use_fast=False, trust_remote_code=True, cache_dir="/mylocal")
model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan-13B-Chat", device_map="auto", torch_dtype=torch.float16, trust_remote_code=True, cache_dir="/mylocal")
model.generation_config = GenerationConfig.from_pretrained("baichuan-inc/Baichuan-13B-Chat", cache_dir="/mylocal")

- Loading with from_pretrained("/mylocal") afterwards raises an error (see the sketch below for one workaround).
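One workaround, sketched under the assumption that the tokenizer and model above were loaded successfully: cache_dir="/mylocal" stores files in the hub cache layout (models--baichuan-inc--Baichuan-13B-Chat/...), so the cache root itself is not a loadable model directory. Exporting with save_pretrained() produces a plain directory that from_pretrained() can load by path; the target directory below is hypothetical.

# Export the already-loaded objects to an ordinary directory (hypothetical path).
export_dir = "/mylocal/Baichuan-13B-Chat"
tokenizer.save_pretrained(export_dir)
model.save_pretrained(export_dir)
model.generation_config.save_pretrained(export_dir)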
I'd like to know this as well.
Simply replacing baichuan-inc/Baichuan-13B-Chat with the local path should work.
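For example, a minimal sketch of that suggestion, assuming the files already sit in a plain local directory (the path /mylocal/Baichuan-13B-Chat below is hypothetical, e.g. an export from save_pretrained() or a full clone of the repo):

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, GenerationConfig

local_path = "/mylocal/Baichuan-13B-Chat"  # hypothetical directory containing config, weights and tokenizer files
tokenizer = AutoTokenizer.from_pretrained(local_path, use_fast=False, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(local_path, device_map="auto", torch_dtype=torch.float16, trust_remote_code=True)
model.generation_config = GenerationConfig.from_pretrained(local_path)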