---
language:
  - uk
  - en
license: apache-2.0
library_name: peft
tags:
  - translation
  - mlx
datasets:
  - Helsinki-NLP/opus_paracrawl
  - turuta/Multi30k-uk
metrics:
  - bleu
pipeline_tag: text-generation
base_model: mistralai/Mistral-7B-v0.1
inference: false
model-index:
  - name: Dragoman
    results:
      - task:
          type: translation
          name: English-Ukrainian Translation
        dataset:
          name: FLORES-101
          type: facebook/flores
          config: eng_Latn-ukr_Cyrl
          split: devtest
        metrics:
          - type: bleu
            value: 32.34
            name: Test BLEU
---

# lang-uk/dragoman-4bit

This model was converted to MLX format from the lang-uk/dragoman adapter fused into the mistralai/Mistral-7B-v0.1 base model, then quantized to 4 bits using mlx-lm version 0.4.0. Refer to the original model card for more details on the model.

## Use with mlx

```shell
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("lang-uk/dragoman-4bit")
response = generate(model, tokenizer, prompt="[INST] who holds this neighborhood? [/INST]", verbose=True)
```
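The model expects the English source sentence wrapped in `[INST] … [/INST]` tags, as in the prompt above. A small helper for building such prompts can keep this consistent (the `build_prompt` function below is our own sketch, not part of mlx-lm):

```python
def build_prompt(english_sentence: str) -> str:
    """Wrap an English source sentence in the [INST] ... [/INST]
    tags that the Dragoman prompt format uses."""
    return f"[INST] {english_sentence.strip()} [/INST]"

prompt = build_prompt("who holds this neighborhood?")
print(prompt)  # [INST] who holds this neighborhood? [/INST]
```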

Or use from your shell:

```shell
python -m mlx_lm.generate --model lang-uk/dragoman-4bit --prompt '[INST] who holds this neighborhood? [/INST]' --temp 0 --max-tokens 100
```