---
tags:
- molecular language model
- SELFIES
- molecule generation
---
# MolGen
MolGen was introduced in the paper ["Molecular Language Model as Multi-task Generator"](https://arxiv.org/pdf/2301.11259.pdf) and first released in [this repository](https://github.com/zjunlp/MolGen). It is a pre-trained molecular generative model built on SELFIES, a 100% robust molecular language representation.
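SELFIES is "100% robust" in the sense that every syntactically valid SELFIES string decodes to a chemically valid molecule. As a quick illustration (not part of the MolGen codebase), the sketch below uses the open-source `selfies` package to round-trip a molecule between SMILES and SELFIES:

```python
# A minimal sketch, assuming the open-source `selfies` package
# (pip install selfies); it is separate from MolGen itself.
import selfies as sf

smiles = "C1=CC=CC=C1"         # benzene, in SMILES
encoded = sf.encoder(smiles)   # SMILES -> SELFIES
decoded = sf.decoder(encoded)  # SELFIES -> SMILES

print(encoded)  # e.g. [C][=C][C][=C][C][=C][Ring1][=Branch1]
print(decoded)  # a SMILES string for the same molecule
```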
## Model description
MolGen is the first pre-trained molecular language model that produces only chemically valid molecules.
With a training corpus of over 100 million molecules in SELFIES representation, MolGen learns the intrinsic structural patterns of molecules by mapping corrupted SELFIES to their original forms.
Specifically, MolGen employs a bidirectional Transformer as its encoder and an autoregressive Transformer as its decoder.
Through its carefully designed multi-task molecular prefix tuning (MPT), MolGen can generate molecules with desired properties, making it a valuable tool for molecular optimization.
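Because MolGen follows the standard encoder-decoder interface described above, it can be loaded through the Transformers library. The sketch below is illustrative only: it assumes a `zjunlp/MolGen-large` checkpoint name, so check the repository above for the actual released checkpoints.

```python
# A minimal generation sketch, assuming the hypothetical checkpoint
# name "zjunlp/MolGen-large"; substitute the checkpoint you actually use.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("zjunlp/MolGen-large")
model = AutoModelForSeq2SeqLM.from_pretrained("zjunlp/MolGen-large")

# MolGen consumes and emits SELFIES strings.
selfies_input = "[C][=C][C][=C][C][=C][Ring1][=Branch1]"
inputs = tokenizer(selfies_input, return_tensors="pt")

# Sample several candidate molecules from the autoregressive decoder.
outputs = model.generate(
    **inputs,
    max_length=64,
    do_sample=True,
    top_k=30,
    num_return_sequences=4,
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```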
### BibTeX entry and citation info
```bibtex
@article{fang2023molecular,
  title={Molecular Language Model as Multi-task Generator},
  author={Fang, Yin and Zhang, Ningyu and Chen, Zhuo and Fan, Xiaohui and Chen, Huajun},
  journal={arXiv preprint arXiv:2301.11259},
  year={2023}
}
```