# fa.intelligence.models.generative.novels.fiction

## Description

This FrostAura Intelligence model is a fine-tuned version of EleutherAI/gpt-neox-20b for fictional text generation.

## Getting Started

### PIP Installation

```bash
pip install -U --no-cache-dir transformers
```

### Usage

```python
from transformers import GPTNeoXForCausalLM, GPTNeoXTokenizerFast

# Load the fine-tuned model and its tokenizer from the Hugging Face Hub.
model_name = 'FrostAura/gpt-neox-20b-fiction-novel-generation'
model = GPTNeoXForCausalLM.from_pretrained(model_name)
tokenizer = GPTNeoXTokenizerFast.from_pretrained(model_name)

prompt = 'GPTNeoX20B is a 20B-parameter autoregressive Transformer model developed by EleutherAI.'

# Tokenize the prompt into input ids.
input_ids = tokenizer(prompt, return_tensors='pt').input_ids

# Sample a continuation of up to 100 tokens (prompt included) with temperature sampling.
gen_tokens = model.generate(
    input_ids,
    do_sample=True,
    temperature=0.9,
    max_length=100,
)

# Decode the generated token ids back into text.
gen_text = tokenizer.batch_decode(gen_tokens)[0]

print(f'Result: {gen_text}')
```

## Further Fine-Tuning

In development.
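
Until an official recipe is published here, the sketch below shows one way continued fine-tuning could be wired up with the Hugging Face Trainer. The corpus file name, sequence length, and hyperparameters are illustrative assumptions, not part of this model card, and full fine-tuning of a 20B-parameter model additionally requires multi-GPU or parameter-efficient setups that are out of scope for this sketch.

```python
# Minimal sketch of continued causal-LM fine-tuning with the Hugging Face Trainer.
# Dataset path, block size, and hyperparameters below are assumptions for illustration.
from datasets import load_dataset
from transformers import (
    DataCollatorForLanguageModeling,
    GPTNeoXForCausalLM,
    GPTNeoXTokenizerFast,
    Trainer,
    TrainingArguments,
)

model_name = 'FrostAura/gpt-neox-20b-fiction-novel-generation'
tokenizer = GPTNeoXTokenizerFast.from_pretrained(model_name)
model = GPTNeoXForCausalLM.from_pretrained(model_name)

# The GPT-NeoX tokenizer has no pad token by default; reuse EOS for padding.
tokenizer.pad_token = tokenizer.eos_token

# Load your own plain-text corpus (hypothetical file name).
dataset = load_dataset('text', data_files={'train': 'my_fiction_corpus.txt'})

def tokenize(batch):
    return tokenizer(batch['text'], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=['text'])

# Causal language modelling: labels are copies of the input ids; the model shifts them internally.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

training_args = TrainingArguments(
    output_dir='gpt-neox-20b-fiction-finetuned',
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    num_train_epochs=1,
    learning_rate=1e-5,
    logging_steps=50,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized['train'],
    data_collator=collator,
)

trainer.train()
```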

## Support

If you enjoy FrostAura open-source content and would like to support us in continuous delivery, please consider a donation via a platform of your choice.

| Supported Platforms | Link |
| --- | --- |
| PayPal | Donate via PayPal |

For any queries, contact [email protected].
