Update README.md
README.md (CHANGED)
@@ -44,14 +44,14 @@ _CC-By-NC-SA-4.0_ (non-commercial use only)
 >I want you to come up with a tweet based on this summary of the article:
 >"Introducing MPT-7B, the latest entry in our MosaicML Foundation Series.
 >MPT-7B is a transformer trained from scratch on 1T tokens of text and code.
->It is open source
+>It is open source and it matches the quality of LLaMA-7B.
 >MPT-7B was trained on the MosaicML platform in 9.5 days with zero human intervention at a cost of ~$200k.
 >Starting today, you can train, finetune, and deploy your own private MPT models, either starting from one of our checkpoints or training from scratch.
 >For inspiration, we are also releasing three finetuned models in addition to the base MPT-7B: MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, the last of which uses a context length of 65k tokens!
 
 **MPT-7B-Instruct**:
 >Here's my suggestion for a tweet:
->"Exciting news! @MosaicML has launched #MPT_7B - a new standard for open-source
+>"Exciting news! @MosaicML has launched #MPT_7B - a new standard for open-source LLMs. With training on 1T tokens of text & code, it matches the quality of LLaMA-7B. Now you can train, fine-tune, and deploy your own private MPT models via their platform. Check out the full blog post here: https://bit.ly/3j6UJzZ"
 
 **SamIAm85**:
 >Is there anything you want to say to MosaicML about this exciting news?
@@ -146,6 +146,11 @@ While great efforts have been taken to clean the pretraining data, it is possibl
 
 This model was finetuned by Sam Havens and the MosaicML NLP team
 
+## Disclaimer
+
+The license on this model does not constitute legal advice. We are not responsible for the actions of third parties who use this model. Please consult an attorney before using this model for commercial purposes.
+
+
 ## MosaicML Platform
 
 If you're interested in [training](https://www.mosaicml.com/training) and [deploying](https://www.mosaicml.com/inference) your own MPT or LLMs on the MosaicML Platform, [sign up here](https://forms.mosaicml.com/demo?utm_source=huggingface&utm_medium=referral&utm_campaign=mpt-7b).
@@ -158,7 +163,8 @@ Please cite this model using the following format:
 ```
 @online{MosaicML2023Introducing,
     author    = {MosaicML NLP Team},
-    title     = {Introducing MPT-7B: A New Standard for Open-Source,
+    title     = {Introducing MPT-7B: A New Standard for Open-Source,
+                 Commercially Usable LLMs},
     year      = {2023},
     url       = {www.mosaicml.com/blog/mpt-7b},
     note      = {Accessed: 2023-03-28}, % change this date