---
base_model: Alsebay/RainyMotip-2x7B
license: apache-2.0
library_name: transformers
tags:
  - 4-bit
  - AWQ
  - text-generation
  - autotrain_compatible
  - endpoints_compatible
  - moe
  - merge
pipeline_tag: text-generation
inference: false
quantized_by: Suparious
---

# Alsebay/RainyMotip-2x7B AWQ

## Model Summary

What is it? A 2x7B Mixture of Experts (MoE) model aimed at roleplay.

You may occasionally get GPT-like responses; just skip them and reroll (gacha time). Overall, I think it is good enough for roleplaying.

You may also want to see this: https://huggingface.co/Alsebay/My_LLMs_Leaderboard
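
Since this repo ships 4-bit AWQ weights, here is a minimal inference sketch using transformers, which can load AWQ checkpoints directly when the autoawq package is installed. The repo id below is a placeholder, not confirmed by this card; substitute the actual id of this quantized repo.

```python
# Minimal inference sketch for the 4-bit AWQ checkpoint.
# Assumes: a CUDA GPU, transformers >= 4.35, and the autoawq package installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "RainyMotip-2x7B-AWQ"  # placeholder: replace with this repo's actual id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    device_map="auto",  # place the quantized weights on the GPU
)

prompt = "Write the opening of a roleplay scene set in a rain-soaked city."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```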

This model is a Mixture of Experts (MoE) made with the following models:

- udkai/Turdus
- Kquant03/Samlagast-7B-laser-bf16
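
The exact merge recipe is not published here, but 2x7B MoE merges like this are commonly produced with mergekit's `mergekit-moe` tool. Below is a hypothetical config sketch of that shape; the choice of base model, gate mode, and positive prompts are illustrative assumptions, not the author's actual recipe.

```yaml
# Hypothetical mergekit-moe config for a 2x7B merge like this one.
# base_model, gate_mode, and positive_prompts are assumptions for illustration.
base_model: udkai/Turdus
gate_mode: hidden   # route tokens by hidden-state similarity to the prompts below
dtype: bfloat16
experts:
  - source_model: udkai/Turdus
    positive_prompts:
      - "roleplay as a character"
  - source_model: Kquant03/Samlagast-7B-laser-bf16
    positive_prompts:
      - "write a story"
```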

If you use it, please let me know whether it works well for you or not. Thank you :)