grammar/clarity adjustment to readme
README.md
CHANGED
@@ -28,9 +28,9 @@ In the paper this is their abstract

 > We hope that TinyStories can facilitate the development, analysis and research of LMs, especially for low-resource or specialized domains, and shed light on the emergence of language capabilities in LMs.

-Maykeye's replication effort
+While Maykeye's replication effort didn't reduce the model all the way down to 1M parameters, it did get down to 5M parameters, which is still quite an achievement as far as known replication efforts go.

-Anyway, this conversion to llamafile should give you an easy way to give this model a shot and also of the whole llamafile ecosystem in general (as it's quite quite small compared to other larger chat capable models). As
+Anyway, this conversion to llamafile should give you an easy way to try this model, and the llamafile ecosystem in general, as it's quite small compared to other, larger chat-capable models. As this is primarily a text generation model, it will open a web server as part of the llamafile process, but it will not engage in chat as one might expect. Instead, you give it a story prompt and it generates a story for you. Don't expect any great stories at this size, but it's an interesting demo of how small you can squeeze an AI model and still have it generate recognisable English.

 ## Usage In Linux
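As a quick illustration of the behaviour described in the added paragraph, a minimal sketch of trying the model under Linux might look like the following. The filename here is a placeholder and the flags follow general llama.cpp/llamafile conventions, which can vary between llamafile versions; the Usage section of the README remains the authoritative set of steps.

```sh
# Placeholder name; substitute the actual .llamafile shipped in this repo.
chmod +x TinyLLama-v0.llamafile

# Running the llamafile starts its built-in web server (by default on port 8080),
# where you can paste a story opening as the prompt.
./TinyLLama-v0.llamafile

# A one-shot generation from the command line may also work, since llamafile
# wraps llama.cpp-style options (-p for the prompt, -n for tokens to generate):
./TinyLLama-v0.llamafile -p "Once upon a time there was a little fox" -n 128
```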