Update README.md
README.md CHANGED
````diff
@@ -63,6 +63,7 @@ It follows only **ChatML** format.
 ```
 
 #### Example code
+**I highly recommend running inference with vLLM. I will write a guide for quick and easy inference if requested.**
 Since the chat_template already contains the instruction format above,
 you can use the code below.
 ```python
@@ -71,7 +72,8 @@ device = "cuda" # the device to load the model onto
 model = AutoModelForCausalLM.from_pretrained("kuotient/Seagull-13B-translation")
 tokenizer = AutoTokenizer.from_pretrained("kuotient/Seagull-13B-translation")
 messages = [
-    {"role": "
+    {"role": "system", "content": "주어진 문장을 한국어로 번역하세요."},
+    {"role": "user", "content": "Here are five examples of nutritious foods to serve your kids."},
 ]
 encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt")
 
````
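The README notes that the model follows only the **ChatML** format, and the example relies on `tokenizer.apply_chat_template` to render it. As a rough illustration of what that template produces, here is a minimal sketch assuming the standard ChatML delimiters (`<|im_start|>` / `<|im_end|>`); the `render_chatml` helper is hypothetical and for illustration only, since the tokenizer's built-in `chat_template` does this for you.

```python
def render_chatml(messages, add_generation_prompt=True):
    """Render a message list using standard ChatML delimiters.

    Illustrative sketch only; prefer tokenizer.apply_chat_template,
    which uses the model's own chat_template.
    """
    text = ""
    for m in messages:
        # Each turn is wrapped as <|im_start|>{role}\n{content}<|im_end|>
        text += f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
    if add_generation_prompt:
        # Open an assistant turn so the model continues from here
        text += "<|im_start|>assistant\n"
    return text

messages = [
    # System prompt: "Translate the given sentence into Korean."
    {"role": "system", "content": "주어진 문장을 한국어로 번역하세요."},
    {"role": "user", "content": "Here are five examples of nutritious foods to serve your kids."},
]
print(render_chatml(messages))
```

In the actual example, `apply_chat_template(messages, return_tensors="pt")` produces the token IDs of this rendered prompt directly, ready to pass to `model.generate`.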