Commit 58b548a (parent 0abfc14) by ZJU-Fangyin: Update README.md
---
widget:
  - text: '[C][=C][C][=C][C][=C][Ring1][=Branch1]'
inference: false
---

# MolGen-large

MolGen-large was introduced in the paper ["Molecular Language Model as Multi-task Generator"](https://arxiv.org/pdf/2301.11259.pdf) and first released in [this repository](https://github.com/zjunlp/MolGen). It is a pre-trained molecular generative model built on SELFIES, a 100% robust molecular language representation.

## Model description

MolGen-large is the first pre-trained model that produces only chemically valid molecules. With a training corpus of over 100 million molecules in SELFIES representation, MolGen-large learns the intrinsic structural patterns of molecules by mapping corrupted SELFIES strings back to their original forms.
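The denoising objective can be sketched in plain Python: split a SELFIES string into its bracketed tokens, corrupt the sequence by masking a contiguous span, and train the model to reconstruct the original. This is a simplified illustration; the span length, placement, and the `[MASK]` token name here are assumptions, not MolGen-large's actual corruption scheme.

```python
import random
import re


def selfies_tokens(s: str) -> list[str]:
    # SELFIES strings are sequences of bracketed tokens, e.g. "[C][=C]".
    return re.findall(r"\[[^\]]*\]", s)


def corrupt(tokens: list[str], span: int = 2, mask: str = "[MASK]", seed: int = 0) -> list[str]:
    # Replace one contiguous span of tokens with a single mask token
    # (text-infilling-style corruption, simplified for illustration).
    rng = random.Random(seed)
    start = rng.randrange(len(tokens) - span + 1)
    return tokens[:start] + [mask] + tokens[start + span:]


benzene = "[C][=C][C][=C][C][=C][Ring1][=Branch1]"
tokens = selfies_tokens(benzene)
corrupted = corrupt(tokens)
# Pre-training maps the corrupted sequence back to the original.
print("".join(corrupted), "->", benzene)
```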
Specifically, MolGen-large employs a bidirectional Transformer as its encoder and an autoregressive Transformer as its decoder. Through its carefully designed multi-task molecular prefix tuning (MPT), MolGen-large can generate molecules with desired properties, making it a valuable tool for molecular optimization.
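The idea behind prefix tuning can be illustrated with a minimal single-head attention step: learned prefix key/value pairs are prepended to the sequence's keys and values, so the frozen model attends to trainable "virtual tokens" without its own weights changing. This is a generic prefix-tuning sketch with toy 2-d vectors, not MolGen-large's actual MPT implementation.

```python
import math


def softmax(xs: list[float]) -> list[float]:
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]


def attend(query: list[float], keys: list[list[float]], values: list[list[float]]) -> list[float]:
    # Scaled dot-product attention for a single query vector.
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    weights = softmax(scores)
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(len(values[0]))]


# Frozen-model keys/values for two real tokens (toy values).
keys = [[1.0, 0.0], [0.0, 1.0]]
values = [[1.0, 0.0], [0.0, 1.0]]

# Prefix tuning: prepend *learned* key/value pairs; only these are trained,
# steering attention toward task-specific information.
prefix_keys = [[0.5, 0.5]]
prefix_values = [[2.0, 2.0]]

query = [1.0, 0.0]
out = attend(query, prefix_keys + keys, prefix_values + values)
```

Because the prefix enters only through attention, the same frozen backbone can serve multiple generation tasks by swapping prefixes.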
![image.png](./model.png)