
code-gemma

Google's `gemma-2b-it` fine-tuned on the `code_instructions_122k_alpaca_style` dataset.

Usage

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="gnumanth/code-gemma")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gnumanth/code-gemma")
model = AutoModelForCausalLM.from_pretrained("gnumanth/code-gemma")
```
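Since the base model is the instruction-tuned `gemma-2b-it`, prompts should follow Gemma's chat format with `<start_of_turn>` / `<end_of_turn>` markers. In practice `tokenizer.apply_chat_template` builds this for you; the sketch below constructs the string by hand to show the expected shape (the example instruction is illustrative, not from this card):

```python
# Sketch: hand-build a Gemma-style chat prompt.
# Normally tokenizer.apply_chat_template(messages, ...) does this;
# shown explicitly here so the turn markers are visible.
def build_prompt(instruction: str) -> str:
    """Wrap a single user instruction in Gemma's chat turn markers."""
    return (
        "<start_of_turn>user\n"
        f"{instruction}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

prompt = build_prompt("Write a Python function that reverses a string.")
# The resulting prompt can then be passed to pipe(prompt, max_new_tokens=...).
```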

Hemanth HM

Format: Safetensors
Model size: 2.51B params
Tensor type: FP16