---
base_model: Locutusque/Selocan-2x7B-v1
inference: false
library_name: transformers
license: apache-2.0
merged_models:
  - TURKCELL/Turkcell-LLM-7b-v1
  - NovusResearch/Novus-7b-tr_v1
pipeline_tag: text-generation
quantized_by: Suparious
tags:
  - 4-bit
  - AWQ
  - text-generation
  - autotrain_compatible
  - endpoints_compatible
  - moe
  - frankenmoe
  - merge
  - mergekit
  - lazymergekit
  - TURKCELL/Turkcell-LLM-7b-v1
  - NovusResearch/Novus-7b-tr_v1
---

# ozayezerceli/Selocan-2x7B-v1 AWQ

## Model Summary

Selocan-2x7B-v1 is a Mixture of Experts (MoE) model made with the following models using LazyMergekit: