---
language:
- en
pipeline_tag: text-generation
library_name: ExLlamaV2
tags:
- llama
- llama-3
license: other
license_name: llama3
license_link: https://huggingface.co/meta-llama/Meta-Llama-3-8B/blob/main/LICENSE
base_model: Sao10K/L3-8B-Niitama-v1
---
# Exl2 quants for [L3-8B-Niitama-v1](https://huggingface.co/Sao10K/L3-8B-Niitama-v1)

## Automatically quantized using the auto quant script from [hf-scripts](https://huggingface.co/anthonyg5005/hf-scripts)

Niitama is a model created by [Sao10K](https://huggingface.co/Sao10K). There is currently no card for this model beyond the fact that it was a horde model.

### BPW:

[6.0](https://huggingface.co/Anthonyg5005/L3-8B-Niitama-v1-exl2/tree/6.0bpw)\
[6.5](https://huggingface.co/Anthonyg5005/L3-8B-Niitama-v1-exl2/tree/6.5bpw)\
[8.0](https://huggingface.co/Anthonyg5005/L3-8B-Niitama-v1-exl2/tree/8.0bpw)\
[measurement.json](https://huggingface.co/Anthonyg5005/L3-8B-Niitama-v1-exl2/blob/main/measurement.json)

# How to download:

### oobabooga's downloader

Use something like [download-model.py](https://github.com/oobabooga/text-generation-webui/blob/main/download-model.py) to download with Python requests.\
Install requirements:

```shell
pip install requests tqdm
```

Example for downloading 8bpw:

```shell
python download-model.py Anthonyg5005/L3-8B-Niitama-v1-exl2:8.0bpw
```

### huggingface-cli

You may also use huggingface-cli.\
To install it, install the huggingface-hub Python package:

```shell
pip install huggingface-hub
```

Example for 8bpw:

```shell
huggingface-cli download Anthonyg5005/L3-8B-Niitama-v1-exl2 --local-dir L3-8B-Niitama-v1-exl2-8bpw --revision 8.0bpw
```

### Git LFS (not recommended)

I recommend the HTTP downloaders over git: they can resume failed downloads and are much easier to work with.\
Make sure git and Git LFS are installed.\
Example for an 8bpw download with git:

Make sure LFS file skipping is disabled:

```shell
# windows
set GIT_LFS_SKIP_SMUDGE=0
# linux
export GIT_LFS_SKIP_SMUDGE=0
```

Clone the repo branch:
```shell
git clone https://huggingface.co/Anthonyg5005/L3-8B-Niitama-v1-exl2 -b 8.0bpw
```
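
### Python API

The same huggingface-hub package used by the CLI above also exposes a Python API. Below is a minimal sketch using `snapshot_download` with the repo id and branch names from this card; the `download_quant` helper and the local directory name are just examples, not part of any official script:

```python
# Sketch: download one quant branch of this repo with the huggingface_hub API.
# Requires: pip install huggingface-hub
from huggingface_hub import snapshot_download


def download_quant(bpw: str = "8.0") -> str:
    """Download the given bpw branch; returns the local directory path.

    The local_dir naming here is an example and can be anything.
    """
    return snapshot_download(
        repo_id="Anthonyg5005/L3-8B-Niitama-v1-exl2",
        revision=f"{bpw}bpw",  # branches are named 6.0bpw, 6.5bpw, 8.0bpw
        local_dir=f"L3-8B-Niitama-v1-exl2-{bpw}bpw",
    )


if __name__ == "__main__":
    # Downloads several GB; run only when you actually want the files.
    print(download_quant("8.0"))
```

Like huggingface-cli, this resumes interrupted downloads, which is the main reason to prefer it over git.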