medmekk (HF staff) committed
Commit 2465824
1 parent: 05f37de

Update README.md

Files changed (1):
  1. README.md (+12 -2)
README.md CHANGED
@@ -59,9 +59,19 @@ Users (both direct and downstream) should be made aware of the risks, biases and
 
 ## How to Get Started with the Model
 
-Use the code below to get started with the model.
+You can easily load and test our model in Transformers. Just follow the code below:
+
+```python
+import torch
+from transformers import AutoModelForCausalLM, AutoTokenizer
+
+# Load the 1.58-bit Llama 3 8B checkpoint and the matching Llama 3 tokenizer.
+model = AutoModelForCausalLM.from_pretrained("HF1BitLLM/Llama3-8B-1.58-100B-tokens", device_map="cuda", torch_dtype=torch.bfloat16)
+tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-8B-Instruct")
+
+input_text = "Daniel went back to the garden. Mary travelled to the kitchen. Sandra journeyed to the kitchen. Sandra went to the hallway. John went to the bedroom. Mary went back to the garden. Where is Mary?\nAnswer:"
+
+# Greedy decoding of a short answer to the prompt above.
+input_ids = tokenizer.encode(input_text, return_tensors="pt").cuda()
+output = model.generate(input_ids, max_new_tokens=10, do_sample=False)
+generated_text = tokenizer.decode(output[0], skip_special_tokens=True)
+print(generated_text)
+```
 
-[More Information Needed]
 
 ## Training Details
 
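For a quick sanity check of the snippet added above, the same load-and-generate flow can also be expressed through the Transformers text-generation pipeline. This is only a sketch: the checkpoint and tokenizer IDs are taken from the diff, while the prompt and generation settings are illustrative and not part of the commit.

```python
# Sketch only: reuses the model/tokenizer IDs from the README diff above;
# the prompt and generation settings are illustrative.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="HF1BitLLM/Llama3-8B-1.58-100B-tokens",
    tokenizer="meta-llama/Meta-Llama-3-8B-Instruct",
    device_map="cuda",
    torch_dtype=torch.bfloat16,
)

# Greedy decoding of a short completion, mirroring the README example.
result = generator("Mary went back to the garden. Where is Mary?\nAnswer:",
                   max_new_tokens=10, do_sample=False)
print(result[0]["generated_text"])
```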