---
license: apache-2.0
language:
  - en
pipeline_tag: text-generation
library_name: transformers
tags:
  - llm
  - code
---

# CrystalChat

We present CrystalChat, an instruction-following model fine-tuned from [LLM360/CrystalCoder](https://huggingface.co/LLM360/CrystalCoder).

| Model | Trained Tokens | ARC | HellaSwag | MMLU (5-shot) | TruthfulQA | Language Avg. | HumanEval (pass@1) | MBPP (pass@1) | Coding Avg. | Avg. of Avg. |
|---|---|---|---|---|---|---|---|---|---|---|
| Mistral 7B | - | 59.98 | 83.31 | 64.16 | 42.15 | 62.40 | 29.12 | 38.78 | 33.95 | 48.68 |
| CrystalChat 7B | 1.4T | 51.71 | 76.12 | 53.22 | 47.29 | 57.08 | 34.12 | 39.11 | 36.62 | 46.85 |
| CrystalCoder 7B | 1.4T | 47.01 | 71.97 | 48.78 | 35.91 | 50.92 | 28.38 | 36.38 | 32.38 | 41.65 |
| CodeLlaMA 7B | 2.5T | 39.93 | 60.80 | 31.12 | 37.82 | 42.42 | 33.50 | 41.40 | 37.45 | 39.94 |
| OpenLLaMA v2 7B | 1T | 43.60 | 72.20 | 41.29 | 35.54 | 48.18 | 15.32 | 12.69 | 28.01 | 38.10 |
| LLaMA 2 7B | 2T | 53.07 | 77.74 | 43.80 | 38.98 | 53.39 | 13.05 | 20.09 | 16.57 | 34.98 |
| StarCoder-15B | 1.03T | - | - | - | - | - | 33.63 | 43.28 | 38.46 | - |

Language Avg. is the mean of the four language-benchmark scores, Coding Avg. is the mean of the HumanEval and MBPP pass@1 scores, and Avg. of Avg. is the mean of those two averages.

## Model Description

## Loading CrystalChat

```python
import torch
from transformers import LlamaTokenizer, LlamaForCausalLM

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = LlamaTokenizer.from_pretrained("LLM360/CrystalChat", trust_remote_code=True)
model = LlamaForCausalLM.from_pretrained("LLM360/CrystalChat", trust_remote_code=True)

prompt = "int add(int x, int y) {"

# Tokenize the prompt and sample a completion of up to 400 tokens
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
gen_tokens = model.generate(input_ids, do_sample=True, max_length=400)

print("-" * 20 + "Output for model" + "-" * 20)
print(tokenizer.batch_decode(gen_tokens)[0])
```
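Since CrystalChat is instruction-tuned, it can also be driven with natural-language prompts rather than raw code prefixes. Below is a minimal sketch using the `transformers` `pipeline` API; the plain-text prompt format, the `bfloat16` dtype, and the `device_map="auto"` placement are assumptions (the repository may define a dedicated chat template), so treat this as a starting point rather than canonical usage.

```python
import torch
from transformers import pipeline

# Build a text-generation pipeline; trust_remote_code is needed for the
# model's custom code, as in the example above.
generator = pipeline(
    "text-generation",
    model="LLM360/CrystalChat",
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,  # assumption: a GPU with bfloat16 support
    device_map="auto",           # assumption: let accelerate place the weights
)

# Hypothetical instruction-style prompt; the exact chat format may differ.
prompt = "Write a Python function that reverses a string."
output = generator(prompt, do_sample=True, max_new_tokens=256)
print(output[0]["generated_text"])
```

If generation quality looks off, the most likely culprit is the prompt format; inspect the tokenizer's special tokens to recover the intended template.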

## Citation

BibTeX:

```bibtex
@misc{liu2023llm360,
      title={LLM360: Towards Fully Transparent Open-Source LLMs},
      author={Zhengzhong Liu and Aurick Qiao and Willie Neiswanger and Hongyi Wang and Bowen Tan and Tianhua Tao and Junbo Li and Yuqi Wang and Suqi Sun and Omkar Pangarkar and Richard Fan and Yi Gu and Victor Miller and Yonghao Zhuang and Guowei He and Haonan Li and Fajri Koto and Liping Tang and Nikhil Ranjan and Zhiqiang Shen and Xuguang Ren and Roberto Iriondo and Cun Mu and Zhiting Hu and Mark Schulze and Preslav Nakov and Tim Baldwin and Eric P. Xing},
      year={2023},
      eprint={2312.06550},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```