minor prose tweaks
README.md CHANGED

@@ -91,7 +91,7 @@ model = transformers.AutoModelForCausalLM.from_pretrained(
 )
 ```
 
-The model was trained
+The model was trained initially on a sequence length of 2048. An additional pre-training phase was included for sequence length adaptation to 8192. However, ALiBi further enables users to increase the maximum sequence length during finetuning and/or inference. For example:
 
 ```python
 import transformers
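The Python block the new paragraph introduces is truncated after `import transformers` in this hunk. A minimal sketch of the pattern such an example typically follows with the Hugging Face `transformers` API, where the maximum sequence length is raised through the config before the model is instantiated; the checkpoint name `mosaicml/mpt-7b` and the `max_seq_len` field are assumptions, not taken from this diff:

```python
import transformers

# Hypothetical checkpoint name; substitute the model this README describes.
name = 'mosaicml/mpt-7b'

# Load the config first so the sequence-length cap can be raised before the
# model is built. `max_seq_len` is an assumed field name; because ALiBi biases
# attention by token distance rather than learned position embeddings, the
# model can extrapolate to lengths beyond those seen in training.
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.max_seq_len = 16384  # (input + output) tokens can now be up to 16384

model = transformers.AutoModelForCausalLM.from_pretrained(
    name,
    config=config,
    trust_remote_code=True,
)
```

Since ALiBi needs no new position-embedding weights at longer lengths, the override is purely a config change, though output quality at lengths far past the 8192 adaptation phase should still be validated empirically.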