readme: change Ontocord to Ontocord.ai
README.md CHANGED
@@ -14,7 +14,7 @@ language:
 GPT-NeoXT-Chat-Base-20B is based on EleutherAI’s GPT-NeoX model, and is fine-tuned with data focusing on dialog-style interactions.
 We focused the tuning on several tasks such as question answering, classification, extraction, and summarization.
 We’ve fine-tuned the model with a collection of 43 million high-quality instructions.
-Together partnered with LAION and Ontocord, who both helped curate the dataset the model is based on.
+Together partnered with LAION and Ontocord.ai, who both helped curate the dataset the model is based on.
 You can read more about this process and the availability of this dataset in LAION’s blog post [here](https://laion.ai/blog/oig-dataset/).
 
 ## Model Details
@@ -187,4 +187,4 @@ Please refer to [togethercomputer/OpenDataHub](https://github.com/togethercomputer/OpenDataHub)
 - **Batch:** 2 x 2 x 64 x 2048 = 524288 tokens
 - **Learning rate:** warmup to 1e-6 for 100 steps and then kept constant
 
-
+
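For context, the second hunk shows the training hyperparameters listed in the model card: a batch of 524288 tokens and a learning rate that warms up to 1e-6 over 100 steps and is then kept constant. A minimal sketch of such a warmup-then-constant schedule in PyTorch is shown below; the AdamW optimizer and the placeholder module are assumptions for illustration and do not come from the model card.

```python
# Sketch only: linear warmup to the base lr (1e-6) over 100 steps, then constant,
# matching the "warmup to 1e-6 for 100 steps and then kept constant" line above.
import torch

model = torch.nn.Linear(10, 10)  # placeholder module, not the actual 20B model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-6)  # optimizer choice is assumed

warmup_steps = 100

def lr_lambda(step: int) -> float:
    # Multiplicative factor on the base lr: linear ramp during warmup, then 1.0,
    # so the effective learning rate stays at 1e-6 after step 100.
    if step < warmup_steps:
        return (step + 1) / warmup_steps
    return 1.0

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda)

# In a training loop, call optimizer.step() followed by scheduler.step() each step.
```

Because the lambda returns a scale factor rather than an absolute rate, the schedule reaches exactly the configured base rate of 1e-6 at the end of warmup and holds it from then on.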