---
license: gemma
datasets:
- ayoubkirouane/Small-Instruct-Alpaca_Format
language:
- en
library_name: transformers
pipeline_tag: text-generation
---
|

## Base model :

- google/gemma-2-9b

## Dataset :

- ayoubkirouane/Small-Instruct-Alpaca_Format
|

## Get Started :

- Load the model directly :

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("ayoubkirouane/gemma-2-9b-alpaca-small-Instruct")
model = AutoModelForCausalLM.from_pretrained("ayoubkirouane/gemma-2-9b-alpaca-small-Instruct")
```
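Since the model was fine-tuned on Alpaca-format data, prompts presumably follow the standard Alpaca template. Below is a minimal sketch of such a prompt builder; `build_alpaca_prompt` is a hypothetical helper, and the exact template the model expects is an assumption based on the dataset's format, not something stated by this card.

```python
# Hypothetical helper: builds a prompt in the standard Alpaca template.
# Assumption: the model expects this template because it was tuned on
# Alpaca-format data; verify against the training setup before relying on it.
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_alpaca_prompt("Explain what a tokenizer does.")

# With the model and tokenizer loaded as shown above, generation would look like:
# inputs = tokenizer(prompt, return_tensors="pt")
# outputs = model.generate(**inputs, max_new_tokens=128)
# print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```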
|

- Use a pipeline as a high-level helper :

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="ayoubkirouane/gemma-2-9b-alpaca-small-Instruct")
```
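A `text-generation` pipeline returns a list of dicts of the form `[{"generated_text": ...}]`, where by default the generated text includes the prompt. The sketch below shows one way to recover just the continuation; `extract_completion` is a hypothetical helper, not part of this model card, and the example output is mocked rather than produced by the model.

```python
# Hypothetical helper: strips the prompt from a text-generation
# pipeline result, leaving only the model's continuation.
def extract_completion(outputs: list, prompt: str) -> str:
    text = outputs[0]["generated_text"]
    return text[len(prompt):].strip() if text.startswith(prompt) else text.strip()

# Mocked pipeline result, for illustration only:
fake_outputs = [{"generated_text": "What is Gemma? Gemma is a family of open models."}]
print(extract_completion(fake_outputs, "What is Gemma?"))

# With the real pipeline defined above:
# completion = extract_completion(pipe("What is Gemma?", max_new_tokens=64), "What is Gemma?")
```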