achal-tri committed
Commit: f81008b
Parent(s): 645f2d9

Update README.md

Files changed (1): README.md +1 -1
README.md CHANGED
@@ -15,7 +15,7 @@ DCLM-IT-1B is a 1.4B billion parameter language model trained on the DCLM-Baseli
 
 | Size | Training Tokens | Layers | Hidden Size | Attention Heads | Context Length |
 |:------:|:-----------------:|:--------:|:-------------:|:-----------------:|:----------------:|
-| 1.4B | 2.608T | 24 | 2048 | 16 | 2048 |
+| 1.4B | 4.308T | 24 | 2048 | 16 | 2048 |
 
 
 ### Model Description
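
The edited row is a hyperparameter summary for the 1.4B model. As a quick illustration only, here is the same information collected into a plain Python dict; the key names are assumptions for readability, not the model's actual config schema:

```python
# Illustrative sketch: the architecture hyperparameters from the updated table,
# gathered as a plain dict. Key names are assumed, not the real config fields.
dclm_it_1b_summary = {
    "size": "1.4B",
    "training_tokens": "4.308T",   # value after this commit (previously 2.608T)
    "layers": 24,
    "hidden_size": 2048,
    "attention_heads": 16,
    "context_length": 2048,
}

print(dclm_it_1b_summary)
```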