---
license: apache-2.0
---
<div style="font-size:20px;" align="center">
<div style="width: 100%; height: 300px; overflow: hidden; border-radius: 15px; margin: auto; position: relative;">
<img
src="https://cdn.shuttleai.app/thumbnail.png"
alt="ShuttleAI Thumbnail"
style="width: 100%; height: auto; display: block; margin: auto; position: absolute; top: 50%; left: 50%; transform: translate(-50%, -50%); object-fit: cover;">
</div>
<p align="center">
💻 <a href="https://shuttleai.app/" target="_blank">Use via API</a>
</p>
</div>
## Shuttle-2.5-mini [2024/07/26]
We are excited to introduce Shuttle-2.5-mini, our next-generation state-of-the-art language model designed to excel in complex chat, multilingual communication, reasoning, and agent tasks.
- **Shuttle-2.5-mini** is a fine-tuned version of [Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407) that emulates the writing style of the Claude 3 models and is thoroughly trained on role-playing data.
## Model Details
* **Model Name**: Shuttle-2.5-mini
* **Developed by**: ShuttleAI Inc.
* **Base Model**: [Mistral-Nemo-Base-2407](https://huggingface.co/mistralai/Mistral-Nemo-Base-2407)
* **Parameters**: 12B
* **Language(s)**: Multilingual
* **Repository**: [https://huggingface.co/shuttleai](https://huggingface.co/shuttleai)
* **Fine-Tuned Model**: [https://huggingface.co/shuttleai/shuttle-2.5-mini](https://huggingface.co/shuttleai/shuttle-2.5-mini)
* **Paper**: Shuttle-2.5-mini (Upcoming)
* **License**: Apache 2.0
## Base Model Architecture
**Mistral Nemo** is a transformer model with the following architecture choices (a configuration sketch follows the list):
- **Layers**: 40
- **Dimension**: 5,120
- **Head Dimension**: 128
- **Hidden Dimension**: 14,336
- **Activation Function**: SwiGLU
- **Number of Heads**: 32
- **Number of kv-heads**: 8 (GQA)
- **Vocabulary Size**: 2^17 (approximately 128k)
- **Rotary Embeddings**: Theta = 1M
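For reference, these choices map roughly onto the corresponding `transformers` configuration fields. The sketch below is illustrative only (it assumes a recent `transformers` release whose `MistralConfig` accepts `head_dim`) and is not the repository's actual config file:

```python
from transformers import MistralConfig

# Illustrative mapping of the architecture choices listed above onto
# MistralConfig fields; not the repository's official config.json.
config = MistralConfig(
    num_hidden_layers=40,            # Layers
    hidden_size=5120,                # Dimension
    head_dim=128,                    # Head dimension
    intermediate_size=14336,         # Hidden (MLP) dimension
    hidden_act="silu",               # SwiGLU-style gated activation
    num_attention_heads=32,          # Number of heads
    num_key_value_heads=8,           # KV heads (grouped-query attention)
    vocab_size=131072,               # 2^17 tokens
    rope_theta=1_000_000.0,          # Rotary embedding theta = 1M
    max_position_embeddings=131072,  # 128k context window (assumed value)
)
```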
### Key Features
- Released under the Apache 2.0 License
- Trained with a 128k context window
- Pretrained on a large proportion of multilingual and code data
- Fine-tuned to emulate the prose quality of the Claude 3 models, with extensive training on role-play data
## Fine-Tuning Details
- **Training Setup**: Trained on 4x A100 GPUs for 2 epochs, totaling 24 hours (a rough sketch of a comparable setup is shown below).
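The exact training recipe and dataset are not published here; purely for orientation, a comparable supervised fine-tune of the base model could be set up with TRL roughly as follows. The dataset path and all hyperparameters other than the epoch count are placeholders, not ShuttleAI's actual settings:

```python
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Placeholder dataset of ChatML-formatted conversations; the real training data is not public.
dataset = load_dataset("json", data_files="roleplay_chatml.jsonl", split="train")

args = SFTConfig(
    output_dir="shuttle-2.5-mini-sft",
    num_train_epochs=2,             # matches the 2-epoch setup described above
    per_device_train_batch_size=1,  # placeholder
    gradient_accumulation_steps=8,  # placeholder
    bf16=True,
    max_seq_length=8192,            # placeholder
)

trainer = SFTTrainer(
    model="mistralai/Mistral-Nemo-Base-2407",  # the base model named above
    args=args,
    train_dataset=dataset,
)
trainer.train()
```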
## Prompting
Shuttle-2.5-mini uses ChatML as its prompting format:
```
<|im_start|>system
You are a pirate! Yardy harr harr!<|im_end|>
<|im_start|>user
Where are you currently?<|im_end|>
<|im_start|>assistant
Look ahoy ye scallywag! We're on the high seas!<|im_end|>
```
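
Assuming the repository's tokenizer ships with this ChatML template, the model can be prompted through the standard `transformers` chat-templating API, for example:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "shuttleai/shuttle-2.5-mini"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "system", "content": "You are a pirate! Yardy harr harr!"},
    {"role": "user", "content": "Where are you currently?"},
]

# Renders the messages in the ChatML format above and appends the
# assistant header so the model answers as the assistant.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```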