
Mixtral_AI_TheAncient-GGUF

Quantized GGUF model files for Mixtral_AI_TheAncient from LeroyDyer

Original Model Card:

LeroyDyer/Mixtral_AI_TheAncient

The Ancient One. The knowledge of self and the past is what has shaped us. The search for truth: are we alone, and have we always been alone? The idea of god or gods, the perception of angels or giants, or the advanced ones? The ancient past is filled with mysteries!

Let's unravel them together with this model, which is full of ancient books: from ancient Khemet to Canaan and Babylon, to China and India, and even the ancient myths of the Celts and Gaels. The Bible has been uploaded in various versions, along with translations into several African-language Bibles, as this is the true source of the language and cultures of the people of the Bible and the ancient past. To decipher history we need an understanding of these ancient languages, since names and places are based on these forms of speech. It is also prudent to include classical prose by Plato and others, to deepen the understanding of world perceptions and of where some of this fantastical thought came from: the dreamers.

The Nag Hammadi library will be uploaded soon, as well as other Egyptian prose and tablets, as there are many. The current focus is on the histories of the world, geographic locations, and currently known histories (foundations). No extra discussion has been added, allowing the model to make unbiased choices about the information.

Uploaded model

  • Developed by: LeroyDyer
  • License: apache-2.0
  • Finetuned from model: LeroyDyer/Mixtral_AI_CYBER_WORLD_

This Mistral model was trained 2x faster with Unsloth and Hugging Face's TRL library.

GGUF

  • Model size: 7.24B params
  • Architecture: llama
  • Available quantizations: 2-bit, 3-bit, 4-bit, 5-bit, 6-bit, 8-bit

Inference Examples
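Below is a minimal sketch of running one of these GGUF files locally with the llama-cpp-python bindings. The filename, context size, and thread count are assumptions for illustration; substitute the quantization variant you actually downloaded from this repository.

```python
from llama_cpp import Llama

# Load a quantized GGUF file. The filename here is hypothetical --
# point model_path at whichever quantization (2-bit ... 8-bit) you downloaded.
llm = Llama(
    model_path="Mixtral_AI_TheAncient.Q4_K_M.gguf",
    n_ctx=2048,    # context window; raise it if you have the RAM for longer prompts
    n_threads=8,   # CPU threads; tune for your machine
)

# Run a single completion and print the generated text.
output = llm(
    "Summarize what the Nag Hammadi library contains.",
    max_tokens=256,
    temperature=0.7,
)
print(output["choices"][0]["text"])
```

As a rule of thumb, the lower-bit quantizations (2-bit, 3-bit) are smaller and faster but lose more quality, while 5-bit and above generally stay closer to the original weights.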