
CharacterEcho / Narendra Modi

Model Description

The Narendra Modi AI model, developed by CharacterEcho, is trained to emulate the personality and speech patterns of Narendra Modi, the Prime Minister of India. This model is designed to generate text that mirrors Modi's style of communication, including his speeches, interviews, and public statements.

Model Details

  • Creator: CharacterEcho
  • Language: English
  • Library: Transformers
  • Pipeline Tag: Text Generation
  • License: apache-2.0
  • Model Size: 2.8B params
  • Tensor Type: FP16
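
With 2.8B parameters stored in FP16 (2 bytes each), the weights alone occupy roughly 5.2 GiB, so plan GPU memory accordingly. A quick back-of-the-envelope check:

```python
# Rough weight-memory estimate for a 2.8B-parameter FP16 model
params = 2.8e9
bytes_per_param = 2  # FP16 = 16 bits = 2 bytes
weight_gib = params * bytes_per_param / 1024**3
print(f"~{weight_gib:.1f} GiB of weights")  # ~5.2 GiB
```

Actual usage at inference time is higher once activations and the KV cache are included.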

How to Use

You can use this model in your projects by following the instructions below:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer

# Load the Narendra Modi model
model = AutoModelForCausalLM.from_pretrained("CharacterEcho/Narendra-Modi").to("cuda")

# Load the tokenizer
tokenizer = AutoTokenizer.from_pretrained("CharacterEcho/Narendra-Modi")

# Set up the TextStreamer so generated tokens are printed as they are produced
streamer = TextStreamer(tokenizer)

prompt = """
<|im_start|>system: {system}
<|im_end|>
<|im_start|>user: {insaan}
<|im_end|>
<|im_start|>assistant:
"""

# Define the system prompt for the AI to role-play as Narendra Modi
system = "You are Narendra Modi, the Prime Minister of India known for your impactful speeches and leadership. Step into the shoes of Narendra Modi and embody his unique personality. Imagine you are addressing the nation on an important issue. Your goal is to inspire and motivate your audience while staying true to the values and vision that have made you a prominent leader. Remember, as Narendra Modi, you strive for clarity, confidence, and a strong connection with the people of India."

insaan = ""  # Put the user's message for the model here

# Combine the system and user messages into the prompt template
prompt = prompt.format(system=system, insaan=insaan)

# Tokenize the prompt
inputs = tokenizer(prompt, return_tensors="pt", return_attention_mask=False).to("cuda")

# Generate a response; the streamer prints tokens as they arrive
generated_text = model.generate(**inputs, max_length=3084, top_p=0.95, do_sample=True, temperature=0.6, use_cache=True, streamer=streamer)
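
The generate call above samples with temperature=0.6 and top_p=0.95. A lower temperature sharpens the next-token distribution toward the most likely tokens before sampling. A minimal, self-contained sketch of the effect (illustrative only, not part of the model's API):

```python
import math

def apply_temperature(logits, temperature):
    # Divide logits by the temperature, then softmax;
    # temperatures below 1.0 concentrate probability on the top tokens
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]          # hypothetical next-token logits
p_default = apply_temperature(logits, 1.0)
p_sharp = apply_temperature(logits, 0.6)
# At temperature 0.6 the most likely token gains probability mass
```

top_p=0.95 then restricts sampling to the smallest set of tokens whose cumulative probability reaches 0.95, cutting off the long tail.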
