# SEA-LION

SEA-LION is a collection of LLMs which have been pretrained and instruct-tuned for the Southeast Asia (SEA) region.
The models range from 3 billion to 7 billion parameters.
This is the card for the SEA-LION 7B model.

SEA-LION stands for <i>Southeast Asia Languages In One Network</i>.

## Model Details

### Model Description

The SEA-LION model is a significant leap forward in the field of natural language processing and understanding,
specifically trained to understand the Southeast Asia (SEA) regional context.

SEA-LION is built on the robust MPT architecture and utilizes a vocabulary size of 256K.