---
license: apache-2.0
---

## Example:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model; trust_remote_code is required for Qwen's custom chat interface.
tokenizer = AutoTokenizer.from_pretrained("Qwen-1_8B-m4-LDJnr-combined", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "Qwen-1_8B-m4-LDJnr-combined",
    device_map="auto",
    trust_remote_code=True,
).eval()

# Single-turn chat; the second return value is the updated conversation history.
response, _ = model.chat(tokenizer, "What kind of a noise annoys a noisy oyster?", history=None)
print(response)
```
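The `chat` helper also returns the running conversation history, which can be passed back in on the next call to keep context across turns. A minimal sketch of a multi-turn exchange, assuming the same remote-code `chat` API as above (the follow-up prompt is only an illustration):

```python
# First turn: start with no history.
response, history = model.chat(
    tokenizer,
    "What kind of a noise annoys a noisy oyster?",
    history=None,
)
print(response)

# Second turn: pass the accumulated history so the model keeps the conversation context.
# The follow-up question here is a hypothetical example prompt.
response, history = model.chat(
    tokenizer,
    "Can you explain the wordplay in that riddle?",
    history=history,
)
print(response)
```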