Text Generation · Transformers · Inference Endpoints
dimalik committed · Commit e84d086 · 1 Parent(s): 4e1fe68

add more examples

Files changed (1): README.md (+11, -1)
README.md CHANGED
@@ -92,11 +92,21 @@ tokenizer = AutoTokenizer.from_pretrained(model_id)
 
 model = AutoModelForCausalLM.from_pretrained(model_id)
 
-prompt = '### 命令:\n文章を文法的にする\n### 入力:\nDear Sir ,\n### 出力:\n\n'
+# English GEC
+prompt = '### 命令:\n文章を文法的にする\n### 入力:\nI has small cat ,\n### 出力:\n\n'
 
 inputs = tokenizer(prompt, return_tensors='pt')
 
 outputs = model.generate(**inputs, max_new_tokens=20)
 
 print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+
+# --> I have a small cat ,
+
+# German GEC
+
+prompt = '### 命令:\n文章を文法的にする\n### 入力:\nIch haben eines kleines Katze ,\n### 出力:\n\n'
+
+# ...
+# --> Ich habe eine kleine Katze ,
 ```
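
For readers who want to run the updated example end to end, here is a minimal, self-contained sketch assembled from the snippet above. The imports, the `model_id` placeholder, and the `correct()` helper (including the split on `'### 出力:'` to strip the echoed prompt from the decoded output) are illustrative assumptions, not part of this repository's README.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical placeholder; substitute the actual Hub repo id.
model_id = "dimalik/gec-model"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

def correct(text: str) -> str:
    """Run grammatical error correction using the Alpaca-style Japanese
    prompt from the README (命令 = instruction, 入力 = input, 出力 = output)."""
    prompt = f"### 命令:\n文章を文法的にする\n### 入力:\n{text}\n### 出力:\n\n"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=20)
    decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
    # generate() echoes the prompt, so keep only the text after '### 出力:'
    # (an assumed post-processing step, not shown in the README itself).
    return decoded.split("### 出力:")[-1].strip()

print(correct("I has small cat ,"))                # --> I have a small cat ,
print(correct("Ich haben eines kleines Katze ,"))  # --> Ich habe eine kleine Katze ,
```

Note that the commit reuses the same Japanese instruction, 文章を文法的にする ("make the sentence grammatical"), for both the English and German examples; only the input sentence changes between languages.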