
Uploaded model

  • Developed by: pmking27
  • License: apache-2.0
  • Finetuned from model: google/gemma-2b

This Gemma model was trained 2x faster with Unsloth and Hugging Face's TRL library.
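
For illustration, a minimal sketch of the kind of Unsloth + TRL fine-tuning setup this implies is shown below. Everything in it is assumed rather than taken from this card: the toy dataset, LoRA settings, and training hyperparameters are placeholders, not the author's actual values.

# Hypothetical fine-tuning sketch (placeholders throughout, not the card author's script)
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import Dataset

# Toy one-example dataset with a pre-formatted "text" column (placeholder)
dataset = Dataset.from_dict(
    {"text": ["### Instruction:\nSay hello.\n\n### Input:\n\n### Response:\nHello!"]}
)

# Load the base model; 4-bit loading reduces memory use (placeholder settings)
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="google/gemma-2b",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters; Unsloth patches these layers for its speedups (placeholder rank)
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# Supervised fine-tuning with TRL's SFTTrainer
trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        max_steps=60,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()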

Running the model:

# Importing necessary modules
from transformers import AutoModelForCausalLM, AutoTokenizer

# Setting the device to load the model onto (assuming GPU availability)
device = 'cuda'

# Loading the tokenizer for the model
tokenizer = AutoTokenizer.from_pretrained("pmking27/PrathameshLLM-2B")

# Loading the pre-trained model
model = AutoModelForCausalLM.from_pretrained("pmking27/PrathameshLLM-2B")

# Defining the Alpaca prompt template
alpaca_prompt = """
### Instruction:
{}

### Input:
{}

### Response:
{}"""

# Providing the input to the model
model_inputs = tokenizer(
[
    alpaca_prompt.format(
        '''
        You're an assistant trained to answer questions using the given context.
        
        context:

        General elections will be held in India from 19 April 2024 to 1 June 2024 to elect the 543 members of the 18th Lok Sabha. The elections will be held in seven phases and the results will be announced on 4 June 2024. This will be the largest-ever election in the world, surpassing the 2019 Indian general election, and will be the longest-held general elections in India with a total span of 44 days (excluding the first 1951–52 Indian general election). The incumbent prime minister Narendra Modi who completed a second term will be contesting elections for a third consecutive term.

        Approximately 960 million individuals out of a population of 1.4 billion are eligible to participate in the elections, which are expected to span a month for completion. The Legislative assembly elections in the states of Andhra Pradesh, Arunachal Pradesh, Odisha, and Sikkim will be held simultaneously with the general election, along with the by-elections for 35 seats among 16 states.
        ''', # instruction
        "भारतातील सार्वत्रिक निवडणुका किती टप्प्यात पार पडतील?", # input
        "", # output - leave this blank for generation!
    )
], return_tensors = "pt")

# Moving the model and the inputs to the specified device
model.to(device)
model_inputs = model_inputs.to(device)

# Generating responses from the model
outputs = model.generate(**model_inputs, max_new_tokens=100)
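# Note (illustrative addition, not from the original card): generate() is greedy
# by default; sampling can be enabled with standard generate() arguments, e.g.:
# outputs = model.generate(**model_inputs, max_new_tokens=100,
#                          do_sample=True, temperature=0.7, top_p=0.9)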
decoded_output = tokenizer.batch_decode(outputs, skip_special_tokens=True)[0]

# Extracting the response text. Since skip_special_tokens=True already removed
# the "<eos>" token, everything after the "### Response:" marker is the answer.
start_marker = "### Response:"
start_pos = decoded_output.find(start_marker) + len(start_marker)
response_text = decoded_output[start_pos:].strip()

print(response_text)

Output:

भारतातील सार्वत्रिक निवडणुका 7 टप्प्यांमध्ये पार पडतील.
(Marathi: "The general elections in India will be held in 7 phases.")
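
As an alternative to the marker search above (a sketch added for illustration, not from the original card), the answer can be isolated by decoding only the tokens generated after the prompt:

# Decode only the newly generated tokens, skipping the echoed prompt
prompt_length = model_inputs["input_ids"].shape[1]
response_text = tokenizer.decode(outputs[0][prompt_length:], skip_special_tokens=True).strip()
print(response_text)

This avoids string matching entirely and works even if the prompt itself happens to contain the marker text.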

Model details:

  • Format: Safetensors
  • Model size: 2.51B params
  • Tensor type: FP16
