ArturBaranowskiAA committed on
Commit dd3ad97
1 Parent(s): 6fdac15

Update README.md

Files changed (1)
  1. README.md +19 -0
README.md CHANGED
@@ -6,4 +6,23 @@ library_name: transformers
 pipeline_tag: text-generation
 ---
 
+This is the safetensors conversion of `Pharia-1-LLM-7B-control-aligned`.
 We provide a joint model card for `Pharia-1-LLM-7B-control` and `Pharia-1-LLM-7B-control-aligned`. Find this model card [here](https://huggingface.co/Aleph-Alpha/Pharia-1-LLM-7B-control).
+
+# Usage
+
+```python
+from transformers import AutoModelForCausalLM, PreTrainedTokenizerFast
+
+INPUT = "Hello, how are you"
+MODEL_ID = "Aleph-Alpha/Pharia-1-LLM-7B-control-aligned-safetensors"
+
+tokenizer = PreTrainedTokenizerFast.from_pretrained(MODEL_ID)
+model = AutoModelForCausalLM.from_pretrained(MODEL_ID, trust_remote_code=True)
+
+inputs = tokenizer(INPUT, return_token_type_ids=False, return_tensors="pt")
+outputs = model.generate(**inputs, max_new_tokens=50)
+
+generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
+print(generated_text)
+```
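
A usage note beyond the commit: the snippet added above loads the model on CPU in full precision. The sketch below is not part of the commit; it shows the same generation flow with bfloat16 weights and automatic device placement, assuming a CUDA device is available and the `accelerate` package is installed for `device_map="auto"`.

```python
# Minimal sketch, not from the commit: the README's generation flow, but loading the
# safetensors weights in bfloat16 and letting accelerate handle device placement.
# Assumes a CUDA device and an installed `accelerate` package.
import torch
from transformers import AutoModelForCausalLM, PreTrainedTokenizerFast

MODEL_ID = "Aleph-Alpha/Pharia-1-LLM-7B-control-aligned-safetensors"

tokenizer = PreTrainedTokenizerFast.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    trust_remote_code=True,      # custom Pharia modeling code, as in the README snippet
    torch_dtype=torch.bfloat16,  # roughly halves memory versus the default float32 load
    device_map="auto",           # let accelerate choose the device placement
)

inputs = tokenizer("Hello, how are you", return_token_type_ids=False, return_tensors="pt")
inputs = {k: v.to(model.device) for k, v in inputs.items()}  # move tensors to the model's device

outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```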