Update README.md
Added our "open source declarative image". All work done on Debian GNU/Linux #!
README.md
CHANGED
@@ -9,6 +9,11 @@ datasets:
 - EleutherAI/pile
 ---
 
+<h1 style='text-align: center '>GPT-NeoX-20b LLM</h1>
+<h2 style='text-align: center '><em>Fork of EleutherAI/gpt-neox-20b</em> </h2>
+<h3 style='text-align: center '>Model Card</h3>
+<img src="https://alt-web.xyz/images/rainbow.png" alt="Rainbow Solutions" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
+
 GPT-NeoX-20B is a 20 billion parameter autoregressive language model trained
 on [the Pile](https://pile.eleuther.ai/) using the [GPT-NeoX
 library](https://github.com/EleutherAI/gpt-neox). Its architecture intentionally