Tags: Text Generation · Transformers · PyTorch · mpt · Composer · MosaicML · llm-foundry · custom_code · text-generation-inference
atrott and sam-mosaic committed
Commit 811e396
1 Parent(s): 6825a30

add community resources to README (#25)


- add community resources to README (5588d3d931430d6d04a65005b6e61a39a16e694a)


Co-authored-by: Sam <[email protected]>

Files changed (1)
  1. README.md +7 -0
README.md CHANGED
@@ -89,6 +89,13 @@ from transformers import AutoTokenizer
  tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
  ```
 
+ ## Community-Created Resources
+
+ These were not created by MosaicML, but you may find them useful. These links are not an endorsement of the creators or their content.
+
+ - [Oobabooga Running MPT-7B-Storywriter](https://youtu.be/QVVb6Md6huA)
+ - [NEW MPT-7B-StoryWriter CRUSHES GPT-4!](https://www.youtube.com/watch?v=O9Y_ZdsuKWQ&t=649s) - Has a long section on running locally using Oobabooga
+
  ## Example Epilogue
 
  The full text of the _The Great Gatsby_ (67873 tokens) was fed to the model, followed by the text "EPILOGUE"
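
For orientation, the unchanged context lines in the hunk above come from the README's usage snippet, which loads the EleutherAI/gpt-neox-20b tokenizer. A minimal sketch of pairing that tokenizer with the model this card describes might look as follows; the mosaicml/mpt-7b-storywriter repo id, the trust_remote_code flag, and the generation settings are assumptions for illustration, not something added by this commit.

```python
# Minimal sketch (not from this commit): load the tokenizer shown in the diff
# context together with the MPT model it is used with, then generate text.
from transformers import AutoModelForCausalLM, AutoTokenizer

# Tokenizer shown in the README's quickstart snippet.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

# Assumed repo id for the model this card describes; MPT ships custom
# modeling code, so trust_remote_code is required when loading it.
model = AutoModelForCausalLM.from_pretrained(
    "mosaicml/mpt-7b-storywriter",
    trust_remote_code=True,
)

# Illustrative prompt and generation settings.
prompt = "EPILOGUE"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```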