Quantization made by Richard Erkhov.
phi-2 - bnb 8bits
- Model creator: https://huggingface.co/susnato/
- Original model: https://huggingface.co/susnato/phi-2/
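This repository contains an 8-bit bitsandbytes quantization of phi-2. As a minimal sketch (not taken from this card), the snippet below shows one common way to obtain such an 8-bit model with the transformers library, assuming the bitsandbytes and accelerate packages are installed and using the original susnato/phi-2 checkpoint together with a BitsAndBytesConfig:

from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# 8-bit quantization via bitsandbytes (assumes bitsandbytes and accelerate are installed)
bnb_config = BitsAndBytesConfig(load_in_8bit=True)

# Load the full-precision weights and quantize them to 8 bits at load time
model = AutoModelForCausalLM.from_pretrained(
    "susnato/phi-2",
    quantization_config=bnb_config,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("susnato/phi-2")

The checkpoint stored in this repository holds the weights already quantized to 8 bits, whereas the sketch above quantizes the full-precision weights on the fly.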
Original model description:
license: mit
license_name: microsoft-research-license
license_link: LICENSE
DISCLAIMER: I don't own the weights of this model; they are the property of Microsoft and were taken from their official repository: microsoft/phi-2.
The sole purpose of this repository is to make the model available through the transformers
API, i.e. to load and use it with the Hugging Face transformers
library.
Usage
First, make sure you have the latest version of the transformers
library installed:
pip uninstall -y transformers && pip install git+https://github.com/huggingface/transformers
Then use the transformers library to load the model and run generation:
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and its tokenizer from the Hub
model = AutoModelForCausalLM.from_pretrained("susnato/phi-2")
tokenizer = AutoTokenizer.from_pretrained("susnato/phi-2")

# Tokenize a code-completion prompt
inputs = tokenizer('''def print_prime(n):
   """
   Print all primes between 1 and n
   """''', return_tensors="pt", return_attention_mask=False)

# Generate a completion and decode it back to text
outputs = model.generate(**inputs, max_length=200)
text = tokenizer.batch_decode(outputs)[0]
print(text)