---
tags:
  - molecular language model
  - SELFIES
  - molecule generation
---

# MolGen

MolGen was introduced in the paper ["Molecular Language Model as Multi-task Generator"](https://arxiv.org/abs/2301.11259) and first released in this repository. It is a pre-trained molecular generative model built on SELFIES, a 100% robust molecular language representation in which every well-formed string decodes to a chemically valid molecule.
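To make the robustness claim concrete, here is a minimal sketch that round-trips between SMILES and SELFIES using the open-source `selfies` Python package (an illustrative choice; this card does not prescribe a particular converter):

```python
import selfies as sf

# Encode a SMILES string (paracetamol) into SELFIES, then decode it back.
smiles = "CC(=O)NC1=CC=C(O)C=C1"
selfies_str = sf.encoder(smiles)     # SMILES -> SELFIES
roundtrip = sf.decoder(selfies_str)  # SELFIES -> SMILES

print(selfies_str)
print(roundtrip)
```

Because any sequence of SELFIES tokens decodes to a valid molecule, a language model that generates SELFIES cannot emit chemically invalid output.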

## Model description

MolGen is the first pre-trained model that exclusively produces chemically valid molecules. Pre-trained on a corpus of over 100 million molecules in SELFIES representation, MolGen learns the intrinsic structural patterns of molecules by mapping corrupted SELFIES strings back to their original forms. Architecturally, it pairs a bidirectional Transformer encoder with an autoregressive Transformer decoder. Through multi-task molecular prefix tuning (MPT), MolGen can steer generation toward desired properties, making it a valuable tool for molecular optimization.
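The encoder-decoder description above suggests the checkpoint loads as a standard Hugging Face sequence-to-sequence model. The sketch below assumes this; the repository id, prompt, and decoding settings are illustrative, not values from the paper:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed Hub repository id for this card; adjust if the model is hosted elsewhere.
checkpoint = "zjunlp/MolGen-large"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Prompt with the SELFIES string for benzene and let the decoder complete it.
inputs = tokenizer("[C][=C][C][=C][C][=C][Ring1][=Branch1]", return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_length=64,
    num_beams=5,
    num_return_sequences=3,  # return several candidate molecules
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True).replace(" ", ""))
```

Each decoded sequence is itself a SELFIES string, so every candidate is guaranteed to correspond to a valid molecule.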

## BibTeX entry and citation info

```bibtex
@article{fang2023molecular,
  title={Molecular Language Model as Multi-task Generator},
  author={Fang, Yin and Zhang, Ningyu and Chen, Zhuo and Fan, Xiaohui and Chen, Huajun},
  journal={arXiv preprint arXiv:2301.11259},
  year={2023}
}
```