Muennighoff committed
Commit: 8f8468c
Parent(s): e775da2
Update README.md
README.md CHANGED
@@ -15,7 +15,9 @@ co2_eq_emissions: 1
 
 > OLMoE is a Mixture-of-Experts LLM with 1.2B active and 6.9B total parameters. It yields state-of-the-art performance among models with a similar cost (1B) and is competitive with much larger models like Llama2-13B. OLMoE is 100% open-source.
 
-
+- Code: https://github.com/allenai/OLMoE
+- Paper:
+- Logs:
 
 # Use
 
@@ -49,8 +51,6 @@ Important branches:
 - `main`: Checkpoint annealed from `step1200000-tokens5033B` for an additional 100B tokens. We use this checkpoint for finetuning our chat model.
 - `fp32`: FP32 version of `main`. The model weights were stored in FP32 during training, but we did not observe any performance drop from casting them to BF16 after training, so we upload all weights in BF16. If you want the original FP32 checkpoint for `main`, you can use this one. You will find that it yields slightly different results but should perform around the same on benchmarks.
 
-The main branch contains the annealed checkpoint.
-
 # Citation
 
 ```bibtex
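The branch notes in the second hunk describe how the weights are published: the annealed `main` branch in BF16 and an `fp32` branch with the original FP32 weights. A minimal sketch of loading either branch, assuming the Hugging Face `transformers` library with OLMoE support; the repo id used below is an assumption, as this diff does not name the model repository:

```python
# Minimal sketch of loading the checkpoints described in the branch notes above.
# Assumptions: the repo id is illustrative (not named in this diff), and an
# installed transformers version that supports OLMoE.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

REPO = "allenai/OLMoE-1B-7B-0924"  # assumed repo id

# `main` branch: annealed checkpoint, weights uploaded in BF16.
model = AutoModelForCausalLM.from_pretrained(REPO, torch_dtype=torch.bfloat16)

# `fp32` branch: original FP32 weights of `main`, if training-time precision is wanted.
model_fp32 = AutoModelForCausalLM.from_pretrained(
    REPO, revision="fp32", torch_dtype=torch.float32
)

tokenizer = AutoTokenizer.from_pretrained(REPO)
```

The `revision` argument selects a repo branch, so other checkpoints such as `step1200000-tokens5033B` can be loaded the same way.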