
Typhoon-1.5-72B: Thai Large Language Model (Pretrained)

Typhoon-1.5-72B is a pretrained Thai 🇹🇭 large language model with 72 billion parameters, based on Qwen1.5-72B.

For the release post, please see our blog.

Model Description

  • Model type: A 72B pretrained decoder-only model based on the Qwen1.5 architecture.
  • Requirement: transformers 4.38.0 or newer.
  • Primary Language(s): Thai 🇹🇭 and English 🇬🇧
  • License: Qwen License
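
The model loads through the standard transformers API (version 4.38.0 or newer, per the requirement above). Below is a minimal sketch; the Hub repo id `scb10x/typhoon-v1.5-72b` is taken from this card's collection, and `device_map="auto"` / BF16 are reasonable defaults for a model of this size rather than settings prescribed by this card.

```python
MODEL_ID = "scb10x/typhoon-v1.5-72b"  # Hub repo id, as listed in this card's collection

def load_typhoon():
    """Load the Typhoon-1.5-72B base model and its tokenizer.

    Downloads roughly 145 GB of BF16 weights, so this needs multiple
    large GPUs (or CPU offload). Imports are kept inside the function
    so the sketch itself stays importable without heavy dependencies.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer  # transformers >= 4.38.0

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # the checkpoint is stored in BF16
        device_map="auto",           # shard layers across available GPUs
    )
    return tokenizer, model
```
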

Intended Uses & Limitations

This model is a pretrained base model; it may not follow human instructions without one-/few-shot prompting or instruction fine-tuning. The model has no moderation mechanisms and may generate harmful or inappropriate responses.

Follow us

https://twitter.com/opentyphoon

Support / Ask any question

https://discord.gg/CqyBscMFpg

SCB10X AI Team

  • Kunat Pipatanakul, Potsawee Manakul, Sittipong Sripaisarnmongkol, Natapong Nitarach, Pathomporn Chokchainant, Kasima Tharnpipitchai
  • If you find Typhoon-72B useful for your work, please cite it using:
@article{pipatanakul2023typhoon,
    title={Typhoon: Thai Large Language Models}, 
    author={Kunat Pipatanakul and Phatrasek Jirabovonvisut and Potsawee Manakul and Sittipong Sripaisarnmongkol and Ruangsak Patomwong and Pathomporn Chokchainant and Kasima Tharnpipitchai},
    year={2023},
    journal={arXiv preprint arXiv:2312.13951},
    url={https://arxiv.org/abs/2312.13951}
}

Contact Us

