license: apache-2.0
Mobius-rwkv-role-play-multiturn-12B-128k is based on the RWKV v5.2 architecture and is good at role play and novel writing.
Use the format below for multi-character group chat.
System: xxxxx (this line is optional)
User: {character name 1}: xxxxx (*optional: thoughts and actions go between asterisks*)
Assistant: {character name 2}: xxxxx
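The format above can be assembled programmatically. A minimal sketch, assuming a hypothetical `build_prompt` helper (not part of the model's API) that joins turns with the blank-line separator RWKV chat models conventionally use:

```python
def build_prompt(turns, system=None):
    """Assemble a multi-character group-chat prompt in the format above.

    turns: list of (role, character, text) tuples, where role is
    "User" or "Assistant". The caller wraps thoughts/actions in *stars*
    inside the text if desired.
    """
    parts = []
    if system:  # the System line is optional
        parts.append(f"System: {system}")
    for role, character, text in turns:
        parts.append(f"{role}: {character}: {text}")
    # Blank lines separate turns; end with "Assistant:" so the model
    # continues with the next character's reply.
    return "\n\n".join(parts) + "\n\nAssistant:"
```

The returned string can be fed directly to the model as the prompt for the next assistant turn.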
Recommended temperature/top_p pairs: 0.7 & 0.6, 1 & 0.3, 1.5 & 0.3, 0.2 & 0.8.
If you encounter repetition, increase the presence penalty.
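One common way a presence penalty works is to subtract a fixed amount from the logit of every token that has already appeared in the output. A minimal numpy sketch (the function name, signature, and default penalty are illustrative, not the model's actual inference code):

```python
import numpy as np

def apply_presence_penalty(logits, generated_token_ids, penalty=0.4):
    """Subtract `penalty` from the logit of every token already generated.

    Raising `penalty` pushes the sampler away from tokens it has
    already emitted, which reduces verbatim repetition.
    """
    logits = logits.copy()
    for tok in set(generated_token_ids):
        logits[tok] -= penalty
    return logits
```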
# Mobius Chat 12B 128K
## Introduction
Mobius is an RWKV v5.2 architecture model, a state-based RNN+CNN+Transformer hybrid language model pretrained on a large amount of data. Compared with the previously released Mobius, the improvements include:
- Only 24 GB of VRAM is needed to run this model locally in fp16;
- Significant performance improvements;
- Multilingual support;
- Stable support for 128K context length;
- Base model: Mobius-mega-12B-128k-base.
## Usage
We encourage you to use few-shot prompting with this model. Directly using `User: xxxx\n\nAssistant: xxxx\n\n` also works well, but few-shot examples can unlock its full potential.
Recommended temperature/top_p pairs: 0.7 & 0.6, 1 & 0.3, 1.5 & 0.3, 0.2 & 0.8.
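The pairs above are temperature and top_p settings for nucleus sampling. A minimal pure-numpy sketch of how the two parameters interact (an illustrative `sample` function, not the actual inference code):

```python
import numpy as np

def sample(logits, temperature=1.0, top_p=0.3, rng=None):
    """Temperature-scaled nucleus (top-p) sampling over raw logits."""
    rng = rng or np.random.default_rng()
    # Temperature rescales the logits: <1 sharpens, >1 flattens.
    scaled = (logits - logits.max()) / temperature
    probs = np.exp(scaled)
    probs /= probs.sum()
    # Keep the smallest set of tokens whose cumulative probability
    # reaches top_p, then renormalize and sample from that set.
    order = np.argsort(probs)[::-1]
    cum = np.cumsum(probs[order])
    cutoff = np.searchsorted(cum, top_p) + 1
    keep = order[:cutoff]
    kept = probs[keep] / probs[keep].sum()
    return int(rng.choice(keep, p=kept))
```

With a low top_p such as 0.3, only the most probable tokens survive the cutoff, which is why the higher-temperature pairs above are matched with a low top_p.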
## More details
Mobius 12B 128k is based on the RWKV v5.2 architecture, a leading state-based RNN+CNN+Transformer hybrid large language model focused on the open-source community:
- 10x to 100x reduction in training/inference cost;
- state-based with selective memory, which makes it good at grokking long context;
- community support.
## Requirements
24 GB of VRAM to run in fp16, 12 GB for int8, and 6 GB for nf4 with the Ai00 server.
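Those figures line up with a back-of-the-envelope estimate of weight memory for a ~12B-parameter model (activation and state overhead ignored; decimal GB assumed):

```python
# Approximate weight-only memory for a 12B-parameter model
# at different precisions (1 GB = 1e9 bytes).
PARAMS = 12e9

def weight_gb(bits_per_param):
    return PARAMS * bits_per_param / 8 / 1e9

print(weight_gb(16))  # fp16 -> 24.0
print(weight_gb(8))   # int8 -> 12.0
print(weight_gb(4))   # nf4  -> 6.0
```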
## Future plans
If you need an HF version, let us know.