Update README.md
README.md CHANGED

@@ -18,7 +18,7 @@ language:
 
 <p align="center">
 - <a href="https://www.nexaai.com/models" target="_blank">Nexa Model Hub</a>
-- <a href="https://arxiv.org/
+- <a href="https://arxiv.org/pdf/2408.15518" target="_blank">ArXiv</a>
 </p>
 
 <p align="center" width="100%">

@@ -130,11 +130,14 @@ This multi-stage approach progressively enhances the model's ability to handle l
 If you use Dolphin in your research, please cite our paper:
 
 ```bibtex
-@article{
-
-
-
-
+@article{chen2024dolphinlongcontextnew,
+  title={Dolphin: Long Context as a New Modality for Energy-Efficient On-Device Language Models},
+  author={Wei Chen and Zhiyuan Li and Shuo Xin and Yihao Wang},
+  year={2024},
+  eprint={2408.15518},
+  archivePrefix={arXiv},
+  primaryClass={cs.CL},
+  url={https://arxiv.org/abs/2408.15518},
 }
 ```
 