---
title: README
emoji: 🐒
colorFrom: red
colorTo: gray
sdk: static
pinned: false
---

| Artefact | Link | Type | Details |
|----------|------|------|---------|
| 🥇 Falcon-40B | Here | pretrained model | 40B parameters trained on 1,000 billion tokens. |
| Falcon-40B-Instruct | Here | instruction/chat model | Falcon-40B finetuned on the Baize dataset. |
| 🥈 Falcon-7B | Here | pretrained model | 6.7B parameters trained on 1,500 billion tokens. |
| Falcon-7B-Instruct | Here | instruction/chat model | Falcon-7B finetuned on the Baize, GPT4All, and GPTeacher datasets. |
| 📀 RefinedWeb | Here | pretraining web dataset | ~600 billion "high-quality" tokens. |
| Falcon-RW-1B | Here | pretrained model | 1.3B parameters trained on 350 billion tokens. |
| Falcon-RW-7B | Here | pretrained model | 7.5B parameters trained on 350 billion tokens. |
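For scripting against these artefacts, a minimal sketch of a name-to-repository lookup is below. The table's "Here" links are not reproduced in this text, so the `tiiuae/...` Hub repository IDs used here are assumptions, not taken from the source:

```python
# Minimal sketch mapping the artefacts above to Hugging Face Hub repo IDs.
# The "tiiuae" organization and exact repo names are assumptions: the
# table's original links are not available here.
FALCON_ARTEFACTS = {
    "Falcon-40B": "tiiuae/falcon-40b",
    "Falcon-40B-Instruct": "tiiuae/falcon-40b-instruct",
    "Falcon-7B": "tiiuae/falcon-7b",
    "Falcon-7B-Instruct": "tiiuae/falcon-7b-instruct",
    "RefinedWeb": "tiiuae/falcon-refinedweb",
    "Falcon-RW-1B": "tiiuae/falcon-rw-1b",
    "Falcon-RW-7B": "tiiuae/falcon-rw-7b",
}

def hub_url(name: str) -> str:
    """Return the (assumed) Hub URL for a named artefact."""
    return f"https://huggingface.co/{FALCON_ARTEFACTS[name]}"

if __name__ == "__main__":
    for name in FALCON_ARTEFACTS:
        print(f"{name}: {hub_url(name)}")
```

With IDs like these, each artefact can be fetched through the usual Hub tooling (e.g. `huggingface_hub` or `transformers.from_pretrained`).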