---
license: mit
license_link: https://huggingface.co/microsoft/Phi-3-medium-4k-instruct/resolve/main/LICENSE
language:
- sq
library_name: transformers
pipeline_tag: text-generation
tags:
- nlp
- code
- llama-cpp
- gguf-my-lora
inference:
parameters:
temperature: 0.7
widget:
- messages:
- role: user
content: >-
Identifiko emrat e personave në këtë artikull 'Majlinda Kelmendi
(lindi më 9 maj 1991), është një xhudiste shqiptare nga Peja, Kosovë.'
base_model: Kushtrim/Phi-3-medium-4k-instruct-sq
---

# NikolayKozloff/Phi-3-medium-4k-instruct-sq-F16-GGUF
This LoRA adapter was converted to GGUF format from [`Kushtrim/Phi-3-medium-4k-instruct-sq`](https://huggingface.co/Kushtrim/Phi-3-medium-4k-instruct-sq) via ggml.ai's [GGUF-my-lora](https://huggingface.co/spaces/ggml-org/gguf-my-lora) space.
Refer to the [original adapter repository](https://huggingface.co/Kushtrim/Phi-3-medium-4k-instruct-sq) for more details.
## Use with llama.cpp

```bash
# with cli
llama-cli -m base_model.gguf --lora Phi-3-medium-4k-instruct-sq-f16.gguf (...other args)

# with server
llama-server -m base_model.gguf --lora Phi-3-medium-4k-instruct-sq-f16.gguf (...other args)
```
To learn more about LoRA usage with the llama.cpp server, refer to the [llama.cpp server documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md).
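Once `llama-server` is running with the adapter loaded, it exposes an OpenAI-compatible chat endpoint. The following is a minimal client sketch, assuming the server is listening on the default port 8080; the prompt reuses the Albanian widget example from the metadata above, and the `query` helper name is illustrative, not part of any API:

```python
import json
import urllib.request

# Build an OpenAI-style chat request for llama-server's /v1/chat/completions
# endpoint. Assumes llama-server was started with --lora as shown above and
# is listening on the default port 8080.
payload = {
    "messages": [
        {
            "role": "user",
            "content": (
                "Identifiko emrat e personave në këtë artikull "
                "'Majlinda Kelmendi (lindi më 9 maj 1991), është një "
                "xhudiste shqiptare nga Peja, Kosovë.'"
            ),
        }
    ],
    "temperature": 0.7,  # matches the inference parameters in the metadata
}

def query(url="http://localhost:8080/v1/chat/completions"):
    """Send the chat request and return the assistant's reply text."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    return body["choices"][0]["message"]["content"]
```

The same payload also works against the server's native `/completion` endpoint if you replace `messages` with a raw `prompt` string.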