varma007ut committed 5b7116a (parent: d0966dd): Update README.md
README.md CHANGED
@@ -35,3 +35,27 @@ To use this model, ensure you have the necessary libraries installed. You can in
 
 ```bash
 pip install transformers
+```
+
+## Usage
+
+Here’s an example of how to load and use the model for text generation:
+
+```python
+from transformers import AutoModelForCausalLM, AutoTokenizer
+
+model_name = "your_model_name"  # Replace with your model's name
+
+# Load model and tokenizer
+tokenizer = AutoTokenizer.from_pretrained(model_name)
+model = AutoModelForCausalLM.from_pretrained(model_name)
+
+# Generate text
+input_text = "What are the symptoms of diabetes?"
+input_ids = tokenizer.encode(input_text, return_tensors='pt')
+
+output = model.generate(input_ids, max_length=150)
+generated_text = tokenizer.decode(output[0], skip_special_tokens=True)
+
+print(generated_text)
+```
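The snippet added in this commit runs as-is with the Hugging Face transformers library. For readers who prefer the higher-level API, here is a minimal sketch of the same generation using transformers' pipeline helper; the model identifier is the same placeholder used in the README snippet and is not this repository's real name:

```python
from transformers import pipeline

# Placeholder identifier, as in the README example above; replace with the
# actual model repository name before running.
model_name = "your_model_name"

# Build a text-generation pipeline, which wraps the tokenizer and model
# loading shown in the diff into a single call.
generator = pipeline("text-generation", model=model_name)

# Generate a continuation of the same prompt used in the README example.
result = generator("What are the symptoms of diabetes?", max_length=150)
print(result[0]["generated_text"])
```

The pipeline form is simply a shorter way to run the same load, tokenize, generate, and decode steps shown in the README example.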