jacobfulano committed
Commit 40e5047 • 1 Parent(s): 47642d6
Update README.md
README.md CHANGED
@@ -28,7 +28,7 @@ Apache-2.0 (commercial use permitted)
 
 * [Blog post: Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs](https://www.mosaicml.com/blog/mpt-7b)
 * [Codebase (mosaicml/llm-foundry repo)](https://github.com/mosaicml/llm-foundry/)
-* Questions: Feel free to contact us via the [MosaicML Community Slack](https://join.slack.com/t/mosaicml-community/shared_invite/zt-
+* Questions: Feel free to contact us via the [MosaicML Community Slack](https://join.slack.com/t/mosaicml-community/shared_invite/zt-1btms90mc-GipE2ufuPkKY0QBrmF3LSA)!
 
 
 ## How to Use
@@ -135,6 +135,11 @@ While great efforts have been taken to clean the pretraining data, it is possible
 
 This model was finetuned by Alex Trott and the MosaicML NLP team
 
+## MosaicML Platform
+
+If you're interested in [training](https://www.mosaicml.com/training) and [deploying](https://www.mosaicml.com/inference) your own MPT or LLMs on the MosaicML Platform, [sign up here](https://forms.mosaicml.com/demo).
+
+
 ## Citation
 
 Please cite this model using the following format:
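For context, the first hunk's trailing lines sit just above the README's "## How to Use" section. A minimal sketch of that section's usage pattern (an illustration, not part of this commit): it assumes a placeholder model ID of `mosaicml/mpt-7b-instruct` standing in for this repo, Hugging Face `transformers`, and the `EleutherAI/gpt-neox-20b` tokenizer used by the base MPT-7B model.

```python
# Sketch only: assumed placeholder model ID; substitute the ID of the repo this README belongs to.
import transformers

model_id = "mosaicml/mpt-7b-instruct"

# MPT models ship custom modeling code, so trust_remote_code=True is required.
model = transformers.AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# MPT-7B models reuse the EleutherAI/gpt-neox-20b tokenizer.
tokenizer = transformers.AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

# Generate a short continuation from a prompt.
inputs = tokenizer("Here is a short poem about open-source LLMs:\n", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```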