
This repository hosts AquilaChat2-34B-16k, the long-context (16k) chat model from the AquilaChat2 series.

Inference

from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# predict.py is shipped with the model repository and wraps the AquilaChat2
# prompt formatting and sampling loop.
from predict import predict

device = torch.device("cuda:0")
model_info = "h2oai/h2ogpt-16k-aquilachat2-34b"

# trust_remote_code=True is required because the repository contains custom model code.
tokenizer = AutoTokenizer.from_pretrained(model_info, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_info, trust_remote_code=True,
                                             torch_dtype=torch.bfloat16)
model.eval()
model.to(device)

text = "Who are you?"
out = predict(model, text, tokenizer=tokenizer, max_gen_len=200, top_p=0.95,
              seed=1234, topk=100, temperature=0.9, sft=True, device=device,
              model_name="h2oai/h2ogpt-16k-aquilachat2-34b")
print(out)
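
If the bundled predict.py helper is not on your path, generation can also be driven through the standard transformers API. The sketch below is a minimal, assumption-laden alternative: it reuses the model and tokenizer loaded above, passes the raw prompt straight to model.generate (so it skips the chat-prompt formatting that predict.py applies), and mirrors the sampling parameters from the call above.

# Minimal raw-generation sketch (assumption: no chat-template formatting,
# which predict.py normally handles for AquilaChat2).
inputs = tokenizer(text, return_tensors="pt").to(device)
with torch.no_grad():
    generated = model.generate(
        **inputs,
        max_new_tokens=200,   # same budget as max_gen_len above
        do_sample=True,
        top_p=0.95,
        top_k=100,
        temperature=0.9,
    )
# Decode only the newly generated tokens.
print(tokenizer.decode(generated[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))

Expect the chat behavior to be closer to the intended model when using predict.py, since it applies the conversation template the model was fine-tuned with.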

License

The Aquila2 series open-source models are licensed under the BAAI Aquila Model Licence Agreement.
