OSError occurs while using from_pretrained
#1 opened by lkk160042
When I try the code as recommended,

import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

model_id = "brildev7/gemma-7b-it-finetune-summarization-ko-sft-qlora"
quantization_config = BitsAndBytesConfig(load_in_4bit=True)  # assumed 4-bit setup

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    quantization_config=quantization_config,
    torch_dtype=torch.float16,
)

this error occurs:
OSError: brildev7/gemma-7b-it-finetune-summarization-ko-sft-qlora does not appear to have a file named config.json. Checkout 'https://huggingface.co/brildev7/gemma-7b-it-finetune-summarization-ko-sft-qlora/main' for available files.
I'm not sure, but I think the error was fixed after updating transformers to version 4.38.0.
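Gemma support landed in transformers 4.38.0, so a quick sanity check on the installed version can rule this out before loading. The `meets_min_version` helper below is a hypothetical sketch, not a transformers API (in real code, `packaging.version.parse` is more robust, e.g. for ".dev0" suffixes):

def meets_min_version(installed: str, required: str = "4.38.0") -> bool:
    """True if dotted version `installed` is at least `required`.

    Compares numerically, so "4.9.0" < "4.38.0" is handled correctly
    (plain string comparison would get this wrong). Assumes purely
    numeric version components.
    """
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(required)

# In practice, pass transformers.__version__ here:
print(meets_min_version("4.37.2"))  # → False, upgrade needed
print(meets_min_version("4.38.0"))  # → True

If the check fails, upgrading with `pip install -U "transformers>=4.38.0"` and retrying `from_pretrained` should be the first thing to try.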