---
license: apache-2.0
datasets:
- stingning/ultrachat
- kaist-ai/CoT-Collection
- mesolitica/google-translate-commitpackft
- Wanfq/Explore_Instruct_Rewriting_32k
- Wanfq/Explore_Instruct_Rewriting_10k
- Wanfq/Explore_Instruct_Brainstorming_16k
- xiyuez/red-dot-design-award-product-description
---
# RWKV v4 7B world model
Finetuned with UltraChat, CoT-Collection, commitpackft, and some novel instruction data.

Uses the full UltraChat and CoT data, about 3B tokens.

If you want to do role play, use this model.
## Contributor
## Design of experiment
This model lost some multi-turn chat ability because the whole UltraChat dataset was used, so I continued tuning on multi-turn data from 2 aspects.
## Training details
## Cases
## Usage
Adjust temperature and top_p for different scenarios.
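As a rough guide, a minimal temperature + top_p (nucleus) sampling sketch is shown below. Note that `sample_logits` is a hypothetical helper written for illustration, not part of any official RWKV API; lower values make output more deterministic (e.g. for CoT tasks), higher values make it more diverse (e.g. for role play).

```python
import numpy as np

def sample_logits(logits, temperature=1.0, top_p=0.9, rng=None):
    """Hypothetical helper: temperature + nucleus (top-p) sampling over raw logits."""
    rng = rng or np.random.default_rng()
    # Temperature-scaled softmax (max-subtracted for numerical stability).
    probs = np.exp((logits - np.max(logits)) / temperature)
    probs /= probs.sum()
    # Keep the smallest set of tokens whose cumulative probability reaches top_p.
    order = np.argsort(probs)[::-1]
    cumulative = np.cumsum(probs[order])
    cutoff = np.searchsorted(cumulative, top_p) + 1
    keep = order[:cutoff]
    kept = probs[keep] / probs[keep].sum()
    return int(rng.choice(keep, p=kept))

# With a very low top_p, sampling collapses to the most likely token.
token = sample_logits(np.array([0.1, 5.0, 0.2]), temperature=0.5, top_p=0.1)
```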
## CoT and lookback
This model can perform the above tasks with 100% accuracy.