---
license: apache-2.0
---
# Mobius Mega 12B 128K base

## Introduction

Mobius is an RWKV v5.2 architecture model: a state-based RNN + CNN + Transformer hybrid language model pretrained on a large corpus.
Compared with the previously released Mobius, the improvements include:

* Runs locally in only 24 GB of VRAM at fp16;
* Significantly improved performance in the chat model;
* Multilingual support;
* Stable support for 128K context length.
* Chat model: [Mobius-12B-128k-chat](https://huggingface.co/TimeMobius/Mobius-Chat-12B-128k)

## Usage
We do not advise using this base model directly; please use the chat model instead. This model is intended for further SFT and instruction tuning.

## More details
Mobius 12B 128K is based on the RWKV v5.2 architecture, a leading state-based RNN + CNN + Transformer hybrid large language model focused on the open-source community:
* 10-100x lower training/inference cost;
* state-based with selective memory, which makes it good at grokking context;
* community support.
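To illustrate why a state-based architecture cuts inference cost, here is a minimal sketch of a fixed-size-state recurrence. This is a generic linear-recurrence toy, not the actual RWKV v5.2 formulas: the decay constant, dimensions, and update rule are illustrative assumptions only.

```python
import random

# Sketch only (NOT the real RWKV v5.2 update rule): a state-based model
# carries a fixed-size state across tokens, so each decoding step costs O(1)
# in sequence length, unlike full attention's per-step cost and growing KV cache.
D = 4                                   # toy head dimension (assumption)
random.seed(0)
DECAY = 0.9                             # toy decay; RWKV learns per-channel decays

def step(state, k, v):
    """Decay the old state and mix in the new token's key/value outer product."""
    return [[DECAY * state[i][j] + k[i] * v[j] for j in range(D)]
            for i in range(D)]

state = [[0.0] * D for _ in range(D)]   # fixed-size recurrent state (D x D)
for _ in range(1000):                   # process 1000 tokens; state never grows
    k = [random.gauss(0, 1) for _ in range(D)]
    v = [random.gauss(0, 1) for _ in range(D)]
    state = step(state, k, v)

print(len(state), len(state[0]))        # 4 4 -- constant memory per head
```

The state stays D x D no matter how many tokens are processed, which is what enables long contexts (such as 128K) without a key/value cache that grows with sequence length.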

## Requirements
24 GB of VRAM to run fp16, 12 GB for int8, or 6 GB for nf4 with the Ai00 server.

* [RWKV Runner](https://github.com/josStorer/RWKV-Runner)
* [Ai00 server](https://github.com/cgisky1980/ai00_rwkv_server)
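The VRAM figures above follow from a back-of-the-envelope estimate of parameter count times bytes per weight; the snippet below reproduces them, ignoring activation and runtime overhead (so real usage will be somewhat higher).

```python
# Back-of-the-envelope VRAM estimate: parameters x bytes per weight.
# Ignores activations, recurrent state, and runtime overhead.
params = 12e9                        # ~12B parameters

bytes_per_weight = {"fp16": 2.0, "int8": 1.0, "nf4": 0.5}
for fmt, nbytes in bytes_per_weight.items():
    gb = params * nbytes / 1e9       # weights only, in GB
    print(f"{fmt}: ~{gb:.0f} GB")
# fp16: ~24 GB, int8: ~12 GB, nf4: ~6 GB -- matching the figures above
```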

## Training details
Pretrained on 100B tokens of high-quality data.

## Future plan
If you need an HF version, let us know.

[Mobius-Chat-12B-128k](https://huggingface.co/TimeMobius/Mobius-Chat-12B-128k)