instruction-pretrain committed
Commit 0af297c
1 Parent(s): 9feb0cd

Update README.md

Files changed (1): README.md (+4 -1)
README.md CHANGED

````diff
@@ -2,6 +2,7 @@
 license: apache-2.0
 datasets:
 - tiiuae/falcon-refinedweb
+- instruction-pretrain/ft-instruction-synthesizer-collection
 language:
 - en
 ---
@@ -19,7 +20,7 @@ We augment the RefinedWeb corproa with instruction-response pairs generated by o
 
 To evaluate our general base model using the [lm-evaluation-harness framework](https://github.com/EleutherAI/lm-evaluation-harness)
 
-1. Setup dependencies:
+1. Setup dependencies
 ```bash
 git clone https://github.com/EleutherAI/lm-evaluation-harness
 cd lm-evaluation-harness
@@ -48,6 +49,8 @@ accelerate launch -m lm_eval --model hf \
 
 ## Citation
 If you find our work helpful, please cite us:
+
+[AdaptLLM](https://huggingface.co/papers/2309.09530):
 ```bibtex
 @inproceedings{
 cheng2024adapting,
````
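For context, the evaluation workflow referenced in the diff (clone the harness, install it, then launch `lm_eval` via `accelerate`) can be sketched as below. The diff truncates the `accelerate launch -m lm_eval --model hf \` command, so the model id and task list here are assumptions for illustration, not taken from the commit; the script only assembles and prints the command rather than executing a download or evaluation.

```shell
#!/bin/sh
# Hedged sketch of the lm-evaluation-harness workflow from the README diff.
# Setup steps (run once, shown as comments so this script stays self-contained):
#   git clone https://github.com/EleutherAI/lm-evaluation-harness
#   cd lm-evaluation-harness
#   pip install -e .

MODEL_ARGS="pretrained=instruction-pretrain/InstructLM-500M"  # assumed model id
TASKS="hellaswag,arc_easy"                                    # assumed task list
BATCH_SIZE="8"                                                # assumed batch size

# Assemble the full form of the command the diff shows truncated.
CMD="accelerate launch -m lm_eval --model hf --model_args ${MODEL_ARGS} --tasks ${TASKS} --batch_size ${BATCH_SIZE}"
echo "${CMD}"
```

Printing the command first makes it easy to inspect the flags before committing to a long evaluation run.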