---
language:
  - ja
tags:
  - causal-lm
  - not-for-all-audiences
  - nsfw
pipeline_tag: text-generation
---

QuantFactory Banner

QuantFactory/Berghof-NSFW-7B-GGUF

This is a quantized version of Elizezen/Berghof-NSFW-7B, created using llama.cpp.
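To run one of the GGUF files from this repository with llama.cpp's CLI, an invocation like the following should work. The quant filename is an assumption here (`Q4_K_M` is shown as an example; use whichever `.gguf` file you actually downloaded), and the sampling settings mirror the original model card's suggested values:

```shell
# Hypothetical filename -- substitute the .gguf quant you downloaded.
./llama-cli -m Berghof-NSFW-7B.Q4_K_M.gguf \
    -p "εΎθΌ©γ―ηŒ«γ§γ‚γ‚‹γ€‚εε‰γ―γΎγ γͺい" \
    -n 512 --temp 1.0 --top-p 0.95
```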

Original Model Card

Berghof NSFW 7B


Model Description

I think this is probably the strongest one.

Usage

Ensure you are using Transformers 4.34.0 or newer.

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Elizezen/Berghof-NSFW-7B")
model = AutoModelForCausalLM.from_pretrained(
  "Elizezen/Berghof-NSFW-7B",
  torch_dtype="auto",
)
model.eval()

if torch.cuda.is_available():
    model = model.to("cuda")

input_ids = tokenizer.encode(
    "εΎθΌ©γ―ηŒ«γ§γ‚γ‚‹γ€‚εε‰γ―γΎγ γͺい",
    add_special_tokens=True,
    return_tensors="pt"
)

tokens = model.generate(
    input_ids.to(device=model.device),
    max_new_tokens=512,
    temperature=1,
    top_p=0.95,
    do_sample=True,
)

out = tokenizer.decode(tokens[0][input_ids.shape[1]:], skip_special_tokens=True).strip()
print(out)
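The `top_p=0.95` argument in the `generate` call above enables nucleus sampling: at each step only the smallest set of tokens whose cumulative probability reaches `top_p` is kept, and the next token is drawn from that renormalized set. A minimal sketch in plain Python of this filtering idea (illustrative only, not the actual `transformers` implementation, with a toy token distribution):

```python
import random

def nucleus_filter(probs, top_p):
    """Keep the smallest set of tokens whose cumulative probability
    reaches top_p, then renormalize the kept probabilities."""
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    kept, total = [], 0.0
    for token, p in ranked:
        kept.append((token, p))
        total += p
        if total >= top_p:
            break
    return {token: p / total for token, p in kept}

def sample(probs, top_p=0.95, rng=random):
    """Draw one token from the nucleus-filtered distribution."""
    filtered = nucleus_filter(probs, top_p)
    tokens, weights = zip(*filtered.items())
    return rng.choices(tokens, weights=weights, k=1)[0]

# Toy distribution over four hypothetical tokens:
dist = {"cat": 0.5, "dog": 0.3, "fish": 0.15, "bird": 0.05}
print(sorted(nucleus_filter(dist, 0.7)))  # ['cat', 'dog']
```

A lower `top_p` restricts generation to the most likely tokens (more conservative text); a value near 1.0, as used above, keeps almost the full distribution, which suits creative writing.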

Intended Use

The model is mainly intended for generating novels. It may not perform as well on instruction-following tasks.