
This is a 2.8B-parameter RNN language model using the Mamba architecture, pretrained on Cerebras' SlimPajama corpus and fine-tuned on Nous Research's OpenHermes 2.5 chat dataset.

Model code: https://github.com/state-spaces/mamba/tree/main
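For reference, installation per the repo's README is roughly the following (version pins and CUDA requirements change; check the repo for current instructions):

pip install causal-conv1d
pip install mamba-ssm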

To load the model, follow the installation instructions in the code repo, and then:

from mamba_ssm.models.mixer_seq_simple import MambaLMHeadModel
model = MambaLMHeadModel.from_pretrained("EleutherAI/Hermes-mamba-2.8b-slimpj")
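Not from the card itself, but as a hedged sketch of running generation with mamba_ssm's generate API, using the chat format described below. The tokenizer choice is an assumption: SlimPajama-trained Mamba checkpoints typically pair with the GPT-NeoX-20B tokenizer.

import torch
from transformers import AutoTokenizer
from mamba_ssm.models.mixer_seq_simple import MambaLMHeadModel

# Assumption: the GPT-NeoX-20B tokenizer, which state-spaces Mamba
# checkpoints are typically paired with.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

model = MambaLMHeadModel.from_pretrained(
    "EleutherAI/Hermes-mamba-2.8b-slimpj", device="cuda", dtype=torch.float16
)

prompt = "\x16user\nWhat is a state space model?\x17\n\x16assistant\n"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to("cuda")

# mamba_ssm's generate() takes a total max_length (prompt + continuation),
# not max_new_tokens; with default arguments it decodes greedily.
out = model.generate(input_ids, max_length=input_ids.shape[1] + 256)
print(tokenizer.decode(out[0]))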

The chat format is identical to the default HuggingFace (ChatML-style) format, except that the <|im_start|> and <|im_end|> special tokens are replaced with the otherwise-unused ASCII control characters "\x16" and "\x17", respectively. This was done for compatibility with our RWKV model.

An example interaction:

\x16user
Write a poem about the beauty of mathematics in the style of Emily Dickinson.\x17
\x16assistant
In the heart of Numbers, so precise,
A Beauty dwells - both cool and nice.
An elegance that sings in sums,
And dances to algebraic drums.

Addition's grace, subtraction's might,
Multiplication's sheer delight,
Division's firm yet gentle hand,
All part of this enchanted land.

Infinity, a lover bold,
To whom the stars their secrets told,
While Zero, cloaked in mystery,
Unlocks each mathematical decree.

Fractals bloom like flowers wild,
Geometry, Nature's favored child,
Calculus, with steady gaze,
Maps out life's intricate maze.

Pi, an endless serenade,
In circles' arms forever laid,
And prime numbers, standing lone,
Make the universe their own.

Each equation, theorem scribed,
In parchment old or computer imbibed,
Is but a whisper from the skies,
A cosmic truth in disguise.\x17

In the transcript above, newlines are printed literally, while the ASCII control codes are escaped for readability.
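A minimal sketch of building prompts in this format (the helper name is hypothetical, and the trailing newline after each "\x17" is inferred from the example above, mirroring the standard ChatML template):

def format_chat(messages, add_generation_prompt=True):
    """Render [{"role": ..., "content": ...}] messages using \x16/\x17
    in place of ChatML's <|im_start|>/<|im_end|>."""
    prompt = ""
    for message in messages:
        prompt += "\x16" + message["role"] + "\n" + message["content"] + "\x17\n"
    if add_generation_prompt:
        # Leave the assistant turn open for the model to complete.
        prompt += "\x16assistant\n"
    return prompt

print(format_chat([{"role": "user", "content": "Write a haiku about state space models."}]))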
