Interactive model evaluation with Trident Chemwidgets

In this tutorial we'll build on the Introduction to Graph Convolutions tutorial to show how you can use the Trident Chemwidgets (TCW) package to interact with and test the model you've trained. Evaluating models on new data, including corner cases, is a critical step toward model deployment. However, generating new molecules to test in an interactive way is rarely straightforward. TCW provides several tools to help subset larger datasets and draw new molecules to test against your models.

You can find the full documentation for the Trident Chemwidgets library here.

Colab

This tutorial and the rest in this sequence are designed to be done in Google Colab. If you'd like to open this notebook in Colab, you can use the following link.

Installing the prerequisites

!pip install tensorflow deepchem trident-chemwidgets seaborn

For this tutorial, you'll need Trident Chemwidgets version 0.2.0 or greater. We can check the installed version with the following command:

import trident_chemwidgets as tcw
print(tcw.__version__)

0.2.1

Throughout this tutorial, we'll use the convention tcw to call the classes from the Trident Chemwidgets package.

Exploring the data

We'll start out by loading the Tox21 dataset and extracting the predefined train, validation, and test splits.

import deepchem as dc

tasks, datasets, transformers = dc.molnet.load_tox21(featurizer='GraphConv')
train_dataset, valid_dataset, test_dataset = datasets

We can then use RDKit to calculate some additional features for each of the training examples. Specifically, we'll compute the log P and molecular weight of each molecule and return this new data in a dataframe.

import rdkit.Chem as Chem
from rdkit.Chem.Crippen import MolLogP
from rdkit.Chem.Descriptors import MolWt
import pandas as pd

data = []
for dataset, split in zip(datasets, ['train', 'valid', 'test']):
    for smiles in dataset.ids:
        mol = Chem.MolFromSmiles(smiles)
        logp = MolLogP(mol)
        mwt = MolWt(mol)
        data.append({'smiles': smiles, 'logp': logp, 'mwt': mwt, 'split': split})

mol_data = pd.DataFrame(data)
mol_data.head()

[15:36:55] WARNING: not removing hydrogen atom without neighbors
  smiles                                     logp     mwt      split
0 CC(O)(P(=O)(O)O)P(=O)(O)O                  -0.9922  206.027  train
1 CC(C)(C)OOC(C)(C)CCC(C)(C)OOC(C)(C)C       4.8172   290.444  train
2 OC[C@H](O)[C@@H](O)[C@H](O)CO              -2.9463  152.146  train
3 CCCCCCCC(=O)[O-].CCCCCCCC(=O)[O-].[Zn+2]   2.1911   351.802  train
4 CC(C)COC(=O)C(C)C                          1.8416   144.214  train

One-dimensional distributions

We can examine one-dimensional distributions using a histogram. Unlike histograms from static plotting libraries like Matplotlib or Seaborn, the TCW Histogram provides interactive functionality. TCW enables subsetting of the data, plotting chemical structures in a gallery next to the plot, and saving a reference to the subset portion of the dataframe. Unfortunately, this interactivity comes at the price of portability, so we have included screenshots for this tutorial in addition to providing the code to generate the interactive visuals. If you run this tutorial yourself (either locally or on Colab), you'll be able to display and interact with full demo plots.

In the plot below, you can see the histogram of the molecular weight distribution from the combined dataset on the left. If you click and drag within the plot area in the live widget, you can subset a portion of the distribution for further examination. The background of the selected portion will turn gray and the selected data points will be shown in teal within the bars of the plot. The x axis of the Histogram widget is compatible with either numeric or date data types, which makes it a convenient choice for splitting your ML datasets based on a property or the date the experimental data were collected.
Histogram example

To generate an interactive example of the widget, run the next cell:

hist = tcw.Histogram(data=mol_data, smiles='smiles', x='mwt')
hist

Histogram(data={'points': [{'smiles': 'CC(O)(P(=O)(O)O)P(=O)(O)O', 'x': 206.027, 'index': 0}, {'smiles': 'CC(C...

If you select a subset of the data by clicking and dragging, you can view the selected structures in the gallery to the right by pressing the SHOW STRUCTURES button beneath the plot. You can extract this subset of the original dataframe by pressing SAVE SELECTION and accessing the hist.selection property as shown in the next cell. This workflow is convenient for applications like data splitting based on a single dimension; a short sketch of such a split follows below.

hist.selection

smiles logp mwt split
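As a hedged illustration (our addition, assuming you have made a selection in the live widget and pressed SAVE SELECTION so that hist.selection is a non-empty dataframe), the saved subset can be turned into a holdout split with ordinary pandas operations:

# Illustrative only: use the widget selection as a holdout set.
# hist.selection remains empty until SAVE SELECTION is pressed in the live widget.
holdout_df = hist.selection
remaining_df = mol_data[~mol_data['smiles'].isin(holdout_df['smiles'])]
print(f'Holdout: {len(holdout_df)} molecules, remaining: {len(remaining_df)} molecules')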
Two- or three-dimensional distributions

In addition to histograms, TCW also provides a class for scatter plots. The Scatter class is useful when comparing two or three dimensions of your data. As of v0.2.0, TCW Scatter supports the use of the x and y axes as well as the color of each point (hue keyword) to represent either continuous or discrete variables. Just like in the Histogram example, you can click and drag within the plot area to subset along the x and y axes. The Scatter widget also supports dates along the x, y, and hue axes.

In the image below, we have selected a portion of the dataset with large molecular weight values but minimal training examples (displayed points in orange) to demonstrate how the Scatter widget can be useful for outlier identification. In addition to selection by bounding box, you can also hover over individual points to display a drawing of the underlying structure.

Scatter example

To generate an interactive example of the widget, run the next cell:

scatter = tcw.Scatter(data=mol_data, smiles='smiles', x='mwt', y='logp', hue='split')
scatter

Scatter(data={'points': [{'smiles': 'CC(O)(P(=O)(O)O)P(=O)(O)O', 'x': 206.027, 'y': -0.9922000000000002, 'hue'...

If you select a subset of the data by clicking and dragging, you can view the selected structures in the gallery to the right by pressing the SHOW STRUCTURES button beneath the plot. You can extract this subset of the original dataframe by pressing SAVE SELECTION and accessing the scatter.selection property as shown in the next cell.

scatter.selection

smiles logp mwt split

Training a GraphConvModel

Now that we've had a look at the training data, we can train a GraphConvModel to predict the 12 Tox21 classes. We'll replicate the training procedure exactly from the Introduction to Graph Convolutions tutorial, training for 50 epochs just as in the original.

# The next line filters tensorflow warnings relevant to memory consumption.
# To see these warnings, comment out the next line.
import warnings; warnings.filterwarnings('ignore')
# Now we'll set the tensorflow seed to make sure the results of this notebook are reproducible
import tensorflow as tf; tf.random.set_seed(27)

n_tasks = len(tasks)
model = dc.models.GraphConvModel(n_tasks, mode='classification')
model.fit(train_dataset, nb_epoch=50)

2022-06-29 15:37:03.915828: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.

0.2661594772338867

Now that we have a trained model, we can check AUROC values for the training and test datasets:

metric = dc.metrics.Metric(dc.metrics.roc_auc_score)
print(f'Training set score: {model.evaluate(train_dataset, [metric], transformers)["roc_auc_score"]:.2f}')
print(f'Test set score: {model.evaluate(test_dataset, [metric], transformers)["roc_auc_score"]:.2f}')

Training set score: 0.97
Test set score: 0.68

Just as in the original tutorial, we see that the model performs reasonably well on the predefined train/test splits. Now we'll use this model to evaluate compounds that are outside the training distribution, just as we might in a real-world drug discovery scenario.

Evaluating the model on new data

One of the challenging first steps toward deploying an ML model in production is evaluating it on new data. Here, new data refers both to data outside the initial train/val/test distributions and to data that may not already be processed for use with the model. We can use the JSME widget provided by TCW to quickly test our model against some molecules of interest.

We'll start with a known therapeutic molecule: ibuprofen. We can see that ibuprofen is not included in any of the datasets that we have evaluated our model against so far (note that we compare against the values of the smiles column; the bare in operator would check the index of the pandas Series instead):

print(f"Ibuprofen structure in Tox21 dataset: {'CC(C)CC1=CC=C(C=C1)C(C)C(=O)O' in mol_data['smiles'].values}")

Ibuprofen structure in Tox21 dataset: False
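One caveat worth noting (our aside, not part of the original workflow): two different SMILES strings can describe the same molecule, so a raw string comparison can produce a false negative. Canonicalizing both sides with RDKit makes the membership check robust:

# Illustrative: canonicalize both sides before comparing, so equivalent SMILES match.
from rdkit import Chem

ibuprofen_canonical = Chem.CanonSmiles('CC(C)CC1=CC=C(C=C1)C(C)C(=O)O')
dataset_canonical = mol_data['smiles'].map(Chem.CanonSmiles)
print(f'Ibuprofen in Tox21 (canonical comparison): {(dataset_canonical == ibuprofen_canonical).any()}')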
To simulate a drug discovery application, let's say you're a chemist tasked with identifying potential new therapeutics derived from ibuprofen. Ideally, the molecules you test would have limited toxicity. You've just developed the model above to predict the tox outcomes from Tox21 data, and now you want to use it to do some first-pass screening of your derivatives.

The standard workflow for a task like this might include drawing the molecules in a program like ChemDraw, exporting to SMILES format, importing into the notebook, then prepping the data and running it through your model. With TCW, we can shortcut the first few steps of that workflow by using the JSME widget to draw molecules and convert them to SMILES directly in the notebook. We can even use the base_smiles argument to specify a base molecular structure, which is great for generating derivatives. Here we'll set the base_smiles value to 'CC(C)CC1=CC=C(C=C1)C(C)C(=O)O', the SMILES string for ibuprofen. Below is a screenshot using JSME to generate a few derivative molecules to test against our toxicity model.

JSME example

To generate your own set of derivatives, run the cell below. To add a SMILES string to the saved set, click the ADD TO SMILES LIST button below the interface. If you want to regenerate the original base molecule, in this case ibuprofen, click the RESET TO BASE SMILES button below the interface. Using this button, it's easy to generate distinct derivatives from a shared starting structure. Go ahead and create some ibuprofen derivatives to test against the tox model:

jsme = tcw.JSME(base_smiles='CC(C)CC1=CC=C(C=C1)C(C)C(=O)O')
jsme

JSME(base_smiles='CC(C)CC1=CC=C(C=C1)C(C)C(=O)O')

You can access the SMILES using the jsme.smiles property. This call will return a list of the SMILES strings that have been added to the SMILES list of the widget (the ones shown in the molecule gallery to the right of the JSME interface).

print(jsme.smiles)

[]

To ensure the rest of this notebook runs correctly, the following cell sets the new test SMILES set to the ones from the screenshot above in case you have not defined your own set using the widget. Otherwise, it will use the molecules you have drawn.

# This cell will provide a preset list of SMILES strings in case you did not create your own.
if len(jsme.smiles) > 0:
    drawn_smiles = jsme.smiles
else:
    drawn_smiles = [
        'CC(C)Cc1ccc(C(C)C(=O)O)cc1',
        'CC(C)C(S)c1ccc(C(C)C(=O)O)cc1',
        'CCSC(c1ccc(C(C)C(=O)O)cc1)C(C)CC',
        'CCSC(c1ccc(C(C)C(=O)O)cc1)C(C)C(=O)O',
        'CC(C(=O)O)c1ccc(C(S)C(C)C(=O)O)cc1'
    ]

Next we have to create a dataset that is compatible with our model to test these new molecules.

featurizer = dc.feat.ConvMolFeaturizer()
loader = dc.data.InMemoryLoader(tasks=list(train_dataset.tasks), featurizer=featurizer)
dataset = loader.create_dataset(drawn_smiles, shard_size=1)

Finally, we can generate our predictions of positive results here and plot them.

predictions = model.predict(dataset, transformers)[:, :, 1]

import seaborn as sns
sns.heatmap(predictions, vmin=0, vmax=1)

<AxesSubplot:>

Now we can get the predicted most toxic compound/assay result for further inspection. Below we extract the highest predicted positive hit (most toxic) and display the assay name, SMILES string, and an image of the structure.

import numpy as np

mol_idx, assay_idx = np.unravel_index(predictions.argmax(), predictions.shape)
smiles = drawn_smiles[mol_idx]
print(f'Most toxic result (predicted): {train_dataset.tasks[assay_idx]}, {smiles}')
mol = Chem.MolFromSmiles(smiles)
mol

Most toxic result (predicted): NR-ER, CC(C)Cc1ccc(C(C)C(=O)O)cc1

Interpreting the model's predictions

Often predictions alone are insufficient to decide whether to move forward with costly experiments. We might also want some metric or metrics that allow us to interpret the model's output. Building on the tutorial Calculating Atomic Contributions for Molecules Based on a Graph Convolutional QSAR Model, we can calculate the relative contribution of each atom in a molecule to the predicted output value. This attribution strategy lets us determine whether the molecular features a chemist would identify as important align with those most affecting the predictions. If the chemist's interpretation and the model's interpretation metrics are consistent, that may indicate that the model is a good fit for the task at hand. However, the inverse is not necessarily true: a model may have the capacity to make accurate predictions that a trained chemist cannot fully understand. Attribution is just one tool in a machine learning practitioner's toolbox.

We'll start by using the built-in per_atom_fragmentation argument for the ConvMolFeaturizer. This will generate a list of ConvMol objects, each with a single atom removed.

featurizer = dc.feat.ConvMolFeaturizer(per_atom_fragmentation=True)
mol_list = featurizer(smiles)
loader = dc.data.InMemoryLoader(tasks=list(train_dataset.tasks), featurizer=dc.feat.DummyFeaturizer())
dataset = loader.create_dataset(mol_list[0], shard_size=1)

We can then run these predictions through the model and retrieve the predicted values for the molecule and assay specified in the last section.

full_molecule_prediction = predictions[mol_idx, assay_idx]
fragment_predictions = model.predict(dataset, transformers)[:, assay_idx, 0]
contributions = pd.DataFrame({
    'Change in predicted toxicity': (full_molecule_prediction - fragment_predictions).round(3)
})

We can use the InteractiveMolecule widget from TCW to superimpose the contribution scores on the molecule itself, allowing us to easily assess the relative importance of each atom to the final prediction. If you click on one of the atoms, you can retrieve the contribution data in a card shown to the right of the structure. In this panel you can also select a variable by which to color the atoms in the plot.

InteractiveMolecule example

You can generate the interactive widget by running the cell below.

tcw.InteractiveMolecule(smiles, data=contributions)

InteractiveMolecule(data=[{'Change in predicted toxicity': 0.5529999732971191}, {'Change in predicted toxicity...
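If you prefer a non-interactive summary, the same dataframe can be sorted directly (a small sketch of ours using only the contributions computed above):

# Rank atoms by how much removing them changes the predicted toxicity.
ranked = contributions.sort_values('Change in predicted toxicity', ascending=False)
print(ranked.head())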
Wrapping up

In this tutorial, we learned how to incorporate Trident Chemwidgets into your DeepChem-based ML workflow. While TCW was built with molecular ML workflows in mind, the library also works well for general cheminformatics notebooks.

Star Trident Chemwidgets on GitHub

If you find the Trident Chemwidgets package helpful, please give it a star on GitHub. Starring the project helps it grow and find new audiences.

Congratulations! Time to join the Community!

Congratulations on completing this tutorial notebook! If you enjoyed working through the tutorial, and want to continue working with DeepChem, we encourage you to finish the rest of the tutorials in this series. You can also help the DeepChem community in the following ways:

Star DeepChem on GitHub

This helps build awareness of the DeepChem project and the tools for open source drug discovery that we're trying to build.

Join the DeepChem Gitter

The DeepChem Gitter hosts a number of scientists, developers, and enthusiasts interested in deep learning for the life sciences. Join the conversation!
Tutorial: ChemBERTa: Large-Scale Self-Supervised Pretraining for Molecular Property Prediction using a SMILES Tokenization Strategy

By Seyone Chithrananda (Twitter)

Deep learning for chemistry and materials science remains a novel field with lots of potential. However, the transfer learning methods that are popular in areas such as natural language processing (NLP) and computer vision have not yet been effectively developed for computational chemistry and machine learning. Using Hugging Face's suite of models and the byte-level tokenizer, we are able to train a large transformer model, RoBERTa, on a corpus of 10,000,000 SMILES strings from a commonly known benchmark chemistry dataset, PubChem. Trained over 10 epochs, the model achieves a good loss of 0.198, and would likely continue to converge if trained for a larger number of epochs. The model can predict masked/corrupted tokens within a SMILES sequence/molecule, allowing for variants of a molecule within discoverable chemical space to be predicted.

By applying the representations of functional groups and atoms learned by the model, we can try to tackle problems of toxicity, solubility, drug-likeness, and synthesis accessibility on smaller datasets, using the learned representations as features for graph convolution and attention models on the graph structure of molecules, as well as fine-tuning of BERT. Finally, we propose the use of attention visualization as a helpful tool for chemistry practitioners and students to quickly identify important substructures underlying various chemical properties. Additionally, visualization of the attention mechanism has been shown in previous research to be incredibly valuable for chemical reaction classification.

The applications of open-sourcing large-scale transformer models such as RoBERTa with Hugging Face may allow for the acceleration of these individual research directions. A link to a repository which includes the training, uploading and evaluation notebooks (with sample predictions on compounds such as Remdesivir) can be found here. All of the notebooks can be copied into a new Colab runtime for easy execution. This repository will be updated with new features, such as attention visualization, easier benchmarking infrastructure, and more. The work behind this tutorial has been published on arXiv, and was accepted for a poster presentation at NeurIPS 2020's ML for Molecules Workshop.

For the sake of this tutorial, we'll be fine-tuning a pre-trained ChemBERTa on a small-scale molecule dataset, ClinTox, to show the potential and effectiveness of Hugging Face's NLP-based transfer learning applied to computational chemistry. Output for some cells is purposely cleared for readability, so do not worry if some output messages for your cells differ!

In short, there are three major components we'll be going over in this notebook:

1. Masked token inference predictions on SMILES strings
2. Attention visualization of the PubChem-10M model
3. Fine-tuning BPE-ChemBERTa and SmilesTokenizer ChemBERTa models on the ClinTox toxicity dataset.

Don't worry if you aren't familiar with some of these terms. We will explain them later in the tutorial! If you're looking to dive deeper, check out the poster here.

Colab

This tutorial and the rest in this sequence are designed to be done in Google Colab. If you'd like to open this notebook in Colab, you can use the following link.
Open in Colab

Setup

To run DeepChem within Colab, you'll need to run the following cell of installation commands. This will take about 5 minutes to run to completion and install your environment.
!curl -Lo conda_installer.py https://raw.githubusercontent.com/deepchem/deepchem/master/scripts/colab_install.py
import conda_installer
conda_installer.install()
!/root/miniconda/bin/conda info -e

% Total % Received % Xferd Average Speed Time Time Time Current
                           Dload Upload Total Spent Left Speed
100 3501 100 3501 0 0 16995 0 --:--:-- --:--:-- --:--:-- 16995
add /root/miniconda/lib/python3.7/site-packages to PYTHONPATH
python version: 3.7.10
remove current miniconda
fetching installer from https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh
done
installing miniconda to /root/miniconda
done
installing rdkit, openmm, pdbfixer
added omnia to channels
added conda-forge to channels
done
conda packages installation finished!

# conda environments:
# base * /root/miniconda

!pip install --pre deepchem
import deepchem
deepchem.__version__

Requirement already satisfied: deepchem in /usr/local/lib/python3.7/dist-packages (2.5.0)
Requirement already satisfied: pandas in /usr/local/lib/python3.7/dist-packages (from deepchem) (1.1.5)
Requirement already satisfied: scipy in /usr/local/lib/python3.7/dist-packages (from deepchem) (1.4.1)
Requirement already satisfied: scikit-learn in /usr/local/lib/python3.7/dist-packages (from deepchem) (0.22.2.post1)
Requirement already satisfied: numpy in /usr/local/lib/python3.7/dist-packages (from deepchem) (1.19.5)
Requirement already satisfied: joblib in /usr/local/lib/python3.7/dist-packages (from deepchem) (1.0.1)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.7/dist-packages (from pandas->deepchem) (2.8.1)
Requirement already satisfied: pytz>=2017.2 in /usr/local/lib/python3.7/dist-packages (from pandas->deepchem) (2018.9)
Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.7/dist-packages (from python-dateutil>=2.7.3->pandas->deepchem) (1.15.0)
wandb: WARNING W&B installed but not logged in. Run `wandb login` or set the WANDB_API_KEY env variable.

'2.5.0'

from rdkit import Chem

We want to install NVIDIA's Apex tool for the training pipeline used by simple-transformers and Weights and Biases. This package enables us to use 16-bit training, mixed precision, and distributed training without any changes to our code. GPUs are generally good at 32-bit (single precision) math, not 16-bit (half) or 64-bit (double precision), so deep learning models have traditionally been trained in 32-bit. By switching to 16-bit, we use half the memory and theoretically less computation, at the expense of the available number range and precision. Pure 16-bit training, however, creates a lot of problems (imprecise weight updates, gradient underflow and overflow); mixed precision training with Apex alleviates these problems.

We will also be installing simple-transformers, a library which builds on top of Hugging Face's transformers package specifically for fine-tuning ChemBERTa.

!git clone https://github.com/NVIDIA/apex
!cd /content/apex
!pip install -v --no-cache-dir /content/apex
!pip install transformers
!pip install simpletransformers
!pip install wandb
!cd ..

import sys
!test -d bertviz_repo && echo "FYI: bertviz_repo directory already exists, to pull latest version uncomment this line: !rm -r bertviz_repo"
# !rm -r bertviz_repo # Uncomment if you need a clean pull from the repo
!test -d bertviz_repo || git clone https://github.com/jessevig/bertviz bertviz_repo
if not 'bertviz_repo' in sys.path:
    sys.path += ['bertviz_repo']
!pip install regex

FYI: bertviz_repo directory already exists, to pull latest version uncomment this line: !rm -r bertviz_repo
Requirement already satisfied: regex in /usr/local/lib/python3.7/dist-packages (2019.12.20)
We're going to clone an auxiliary repository, bert-loves-chemistry, which will enable us to use the MolNet dataloader for ChemBERTa, which automatically generates scaffold splits on any MoleculeNet dataset!

!git clone https://github.com/seyonechithrananda/bert-loves-chemistry.git

fatal: destination path 'bert-loves-chemistry' already exists and is not an empty directory.

!nvidia-smi

Thu Mar 18 14:48:19 2021
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 460.56       Driver Version: 460.32.03    CUDA Version: 11.2     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  Tesla P100-PCIE...  Off  | 00000000:00:04.0 Off |                    0 |
| N/A   31C    P0    26W / 250W |      0MiB / 16280MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+

# Test if the NVIDIA apex training tool works
from apex import amp

What is a tokenizer?

A tokenizer is in charge of preparing the inputs for a natural language processing model. For many scientific applications, it is possible to treat inputs as "words"/"sentences" and use NLP methods to make meaningful predictions. For example, SMILES strings or DNA sequences have grammatical structure and can be usefully modeled with NLP techniques. DeepChem provides some scientifically relevant tokenizers for use in different applications. These tokenizers are based on those from the Hugging Face transformers library (which DeepChem tokenizers inherit from).

The base classes PreTrainedTokenizer and PreTrainedTokenizerFast in Hugging Face implement the common methods for encoding string inputs into model inputs and for instantiating/saving python tokenizers, either from a local file or directory or from a pretrained tokenizer provided by the library (downloaded from Hugging Face's AWS S3 repository). PreTrainedTokenizer (transformers.PreTrainedTokenizer) thus implements the main methods for using all the tokenizers:

Tokenizing (splitting strings into sub-word token strings), converting token strings to ids and back, and encoding/decoding (i.e. tokenizing + converting to integers),
Adding new tokens to the vocabulary in a way that is independent of the underlying structure (BPE, SentencePiece...),
Managing special tokens like mask, beginning-of-sentence, etc. (adding them, assigning them to attributes in the tokenizer for easy access, and making sure they are not split during tokenization).

The default tokenizer used by ChemBERTa is a Byte-Pair Encoder (BPE). It is a hybrid between character- and word-level representations, which allows for the handling of large vocabularies in natural language corpora.
Motivated by the intuition that rare and unknown words can often be decomposed into multiple known subwords, BPE finds the best word segmentation by iteratively and greedily merging frequent pairs of characters.

First, let's load the model's Byte-Pair Encoding tokenizer and model, and set up a Hugging Face pipeline for masked token prediction.

from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline, RobertaModel, RobertaTokenizer
from bertviz import head_view

model = AutoModelForMaskedLM.from_pretrained("seyonec/PubChem10M_SMILES_BPE_450k")
tokenizer = AutoTokenizer.from_pretrained("seyonec/PubChem10M_SMILES_BPE_450k")
fill_mask = pipeline('fill-mask', model=model, tokenizer=tokenizer)

Downloading: 0%| | 0.00/515 [00:00<?, ?B/s]
Downloading: 0%| | 0.00/336M [00:00<?, ?B/s]
Downloading: 0%| | 0.00/165k [00:00<?, ?B/s]
Downloading: 0%| | 0.00/101k [00:00<?, ?B/s]
Downloading: 0%| | 0.00/772 [00:00<?, ?B/s]
Downloading: 0%| | 0.00/62.0 [00:00<?, ?B/s]
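Before moving on, we can peek at what the tokenizer actually does to a SMILES string (a quick illustration of ours using the tokenizer loaded above; the example molecule is ibuprofen):

example = 'CC(C)Cc1ccc(C(C)C(=O)O)cc1'  # ibuprofen
print(tokenizer.tokenize(example))  # sub-word tokens produced by BPE
print(tokenizer.encode(example))    # the corresponding integer ids, with special tokens added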
What is a transformer model?

Previously, we spoke about the attention mechanism in modern deep learning models. Attention is a concept that helped improve the performance of neural machine translation applications. The Transformer is a model that uses attention to boost the speed with which such models can be trained. With the emergence of BERT by Google AI in 2018, transformers quickly shot to the top of emerging deep learning methods, outperforming neural machine translation models such as seq2seq and recurrent neural networks at dozens of tasks. The biggest benefit, however, comes from how the Transformer lends itself to efficient pre-training.

Using the same pre-training procedure as RoBERTa, a follow-up work to BERT, we mask 15% of the tokens in each SMILES string and assign a maximum sequence length of 256 characters. The model then learns to predict masked tokens consisting of atoms and functional groups, i.e. specific groups of atoms within molecules which have their own characteristic properties. Through this, the model learns the relevant molecular context for transferable tasks, such as property prediction. ChemBERTa employs a bidirectional training context to learn context-aware representations of the PubChem 10M dataset, downloadable through MoleculeNet for self-supervised pre-training (link). Our variant of the BERT transformer uses 12 attention heads and 6 layers, resulting in 72 distinct attention mechanisms. The Transformer was proposed in the paper Attention is All You Need.

Now, to ensure the ChemBERTa model demonstrates an understanding of chemical syntax and molecular structure, we'll test it on predicting a masked token/character within the SMILES string for benzene. Using the Hugging Face pipeline we initialized earlier, we can fetch a list of the model's predictions, ordered by confidence score:

smiles_mask = "C1=CC=CC<mask>C1"
smiles = "C1=CC=CC=C1"

masked_smi = fill_mask(smiles_mask)
for smi in masked_smi:
    print(smi)

{'sequence': 'C1=CC=CC=C1', 'score': 0.9755934476852417, 'token': 33, 'token_str': '='}
{'sequence': 'C1=CC=CC#C1', 'score': 0.020923888310790062, 'token': 7, 'token_str': '#'}
{'sequence': 'C1=CC=CC1C1', 'score': 0.0007658962858840823, 'token': 21, 'token_str': '1'}
{'sequence': 'C1=CC=CC2C1', 'score': 0.0004129768058191985, 'token': 22, 'token_str': '2'}
{'sequence': 'C1=CC=CC=[C1', 'score': 0.00025319133419543505, 'token': 352, 'token_str': '=['}

Here, we get some interesting results. The top prediction, C1=CC=CC=C1, is a benzene ring. Since it's a pretty common molecule, the model easily predicts the final carbon-carbon double bond with a score of 0.98. Let's get a list of the top 5 predictions and visualize them (with a highlighted focus on the final benzene-like pattern). To visualize them, we'll be using the RDKit cheminformatics package we installed earlier, specifically the rdkit.Chem.Draw module.
import torch
import rdkit
import rdkit.Chem as Chem
from rdkit.Chem import rdFMCS
from matplotlib import colors
from rdkit.Chem import Draw
from rdkit.Chem.Draw import MolToImage
from PIL import Image

def get_mol(smiles):
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return None
    Chem.Kekulize(mol)
    return mol

def find_matches_one(mol, submol):
    # Find all atoms in mol that match the substructure submol.
    match_dict = {}
    mols = [mol, submol]  # pairwise search
    res = rdFMCS.FindMCS(mols)  # ,ringMatchesRingOnly=True)
    mcsp = Chem.MolFromSmarts(res.smartsString)
    matches = mol.GetSubstructMatches(mcsp)
    return matches

# Draw the molecule
def get_image(mol, atomset):
    hcolor = colors.to_rgb('green')
    if atomset is not None:
        # Highlight the atom set while drawing the whole molecule.
        img = MolToImage(mol, size=(600, 600), fitImage=True, highlightAtoms=atomset, highlightColor=hcolor)
    else:
        img = MolToImage(mol, size=(400, 400), fitImage=True)
    return img

sequence = f"C1=CC=CC={tokenizer.mask_token}1"
substructure = "CC=CC"
image_list = []

input = tokenizer.encode(sequence, return_tensors="pt")
mask_token_index = torch.where(input == tokenizer.mask_token_id)[1]
token_logits = model(input)[0]
mask_token_logits = token_logits[0, mask_token_index, :]
top_5_tokens = torch.topk(mask_token_logits, 5, dim=1).indices[0].tolist()

for token in top_5_tokens:
    smi = sequence.replace(tokenizer.mask_token, tokenizer.decode([token]))
    print(smi)
    smi_mol = get_mol(smi)
    substructure_mol = get_mol(substructure)
    if smi_mol is None:  # if the model's token prediction isn't chemically feasible
        continue
    Draw.MolToFile(smi_mol, smi + ".png")
    matches = find_matches_one(smi_mol, substructure_mol)
    atomset = list(matches[0])
    img = get_image(smi_mol, atomset)
    img.format = "PNG"
    image_list.append(img)

C1=CC=CC=CC1
C1=CC=CC=CCC1
C1=CC=CC=CN1
C1=CC=CC=CCCC1
C1=CC=CC=CCO1

from IPython.display import Image
for img in image_list:
    display(img)

[Rendered structures of the five predicted molecules appear here.]
As we can see above, 5 out of 5 of the model's MLM predictions are chemically valid. Overall, the model seems to understand the syntax with a decent degree of certainty. However, further training on a more specific dataset (say, leads for a specific target) may produce a stronger chemical transformer model. Let's now fine-tune our model on a dataset of our choice, ClinTox. You can run ChemBERTa on any MoleculeNet dataset, but for the sake of convenience, we will use ClinTox as it is small and trains quickly.

What is attention?

Previously, recurrent models struggled with generating a fixed-length vector for large sequences, leading to deteriorating performance as the length of an input sequence increased. Attention is, to some extent, motivated by how we pay visual attention to different regions of our vision or how we correlate words in a sentence. Human visual attention allows us to focus on a certain subregion with higher focus while perceiving the surrounding image with lower focus, and then adjust the focal point. Similarly, we can explain the relationship between words in one sentence or a close context. When we see "eating", we expect to read a food word very soon; a color term describes the food, but probably not as directly as "eating" does.

The attention mechanism extends the encoder-decoder model by computing three values for a SMILES sequence: a value vector (V), a query vector (Q) and a key vector (K). Each vector is similar to a type of word embedding, specifically for determining the compatibility of neighbouring tokens. From these vectors, the dot-product attention is derived from the dot product of the query vector of one word and the key vector of the other.
A scaling factor of $\frac{1}{\sqrt{d_k}}$ is added to the dot-product attention so that the value doesn't grow too large with respect to $d_k$, the dimension of the key. The softmax normalization function is then applied to return a score between 0 and 1 for each individual token:

$$\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{T}}{\sqrt{d_k}}\right)V$$
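To make the formula concrete, here is a minimal NumPy sketch of ours (illustrative only, not part of ChemBERTa's code) of scaled dot-product attention for a toy sequence of 4 tokens with key dimension d_k = 8:

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # compatibility of each query with each key
    # Row-wise softmax: each row of weights sums to 1 and lies in [0, 1].
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V  # weighted sum of the value vectors

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)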
Visualizing the Attention Mechanism in ChemBERTa using BertViz

BertViz is a tool for visualizing attention in the Transformer model, supporting all models from the transformers library (BERT, GPT-2, XLNet, RoBERTa, XLM, CTRL, etc.). It extends the Tensor2Tensor visualization tool by Llion Jones and the transformers library from Hugging Face. Using this tool, we can easily plug in ChemBERTa from the Hugging Face model hub and visualize the attention patterns produced by one or more attention heads in a given transformer layer. This is known as the attention-head view.

Let's start by obtaining a Javascript object for d3.js and jquery to create interactive visualizations:

%%javascript
require.config({
  paths: {
    d3: '//cdnjs.cloudflare.com/ajax/libs/d3/3.4.8/d3.min',
    jquery: '//ajax.googleapis.com/ajax/libs/jquery/2.0.0/jquery.min',
  }
});

def call_html():
    import IPython
    display(IPython.core.display.HTML('''
        <script src="/static/components/requirejs/require.js"></script>
        <script>
          requirejs.config({
            paths: {
              base: '/static/base',
              "d3": "https://cdnjs.cloudflare.com/ajax/libs/d3/3.5.8/d3.min",
              jquery: '//ajax.googleapis.com/ajax/libs/jquery/2.0.0/jquery.min',
            },
          });
        </script>
        '''))

Now, we create an instance of ChemBERTa, tokenize a set of SMILES strings, and compute the attention for each head in the transformer. There are two available models hosted by DeepChem on Hugging Face's model hub: seyonec/ChemBERTa-zinc-base-v1, the ChemBERTa model trained via masked language modelling (MLM) on the ZINC100k dataset, and seyonec/ChemBERTa-zinc250k-v1, trained via MLM on the larger ZINC250k dataset.

In the following example, we take two SMILES molecules from the ZINC database with nearly identical chemical structure, the only difference being rooted in chiral specification (hence the additional '@' symbol). This is a feature of molecules indicating that there exist tetrahedral centres: '@' tells us whether the neighbours of a molecule appear in a counter-clockwise order, whereas '@@' indicates that the neighbours are ordered in a clockwise direction. The model should ideally attend to similar substructures in each SMILES string with a higher attention weighting.

Let's look at the first SMILES string, CCCCC[C@@H](Br)CC:

m = Chem.MolFromSmiles('CCCCC[C@@H](Br)CC')
fig = Draw.MolToMPL(m, size=(200, 200))

And the second SMILES string, CCCCC[C@H](Br)CC:

m = Chem.MolFromSmiles('CCCCC[C@H](Br)CC')
fig = Draw.MolToMPL(m, size=(200, 200))

The visualization below shows the attention induced by a sample input SMILES. This view visualizes attention as lines connecting the tokens being updated (left) with the tokens being attended to (right), following the design of the figures above. Color intensity reflects the attention weight; weights close to one show as very dark lines, while weights close to zero appear as faint lines or are not visible at all. The user may highlight a particular SMILES character to see the attention from that token only. This visualization is called the attention-head view. It is based on the excellent Tensor2Tensor visualization tool, and is generated by the BertViz library.

from transformers import RobertaModel, RobertaTokenizer
from bertviz import head_view

model_version = 'seyonec/PubChem10M_SMILES_BPE_450k'
model = RobertaModel.from_pretrained(model_version, output_attentions=True)
tokenizer = RobertaTokenizer.from_pretrained(model_version)

sentence_a = "CCCCC[C@@H](Br)CC"
sentence_b = "CCCCC[C@H](Br)CC"
inputs = tokenizer.encode_plus(sentence_a, sentence_b, return_tensors='pt', add_special_tokens=True)
input_ids = inputs['input_ids']
attention = model(input_ids)[-1]
input_id_list = input_ids[0].tolist()  # Batch index 0
tokens = tokenizer.convert_ids_to_tokens(input_id_list)
call_html()
head_view(attention, tokens)

Layer:
Smiles-Tokenizer Attention by Head View

The visualization shows that attention is highest between tokens that don't cross the boundary between the two SMILES strings; the model seems to understand that it should relate tokens to other tokens in the same molecule in order to best understand their context. There are many other fascinating visualizations we can do, such as a neuron-by-neuron analysis of attention or a model overview that visualizes all of the heads at once:

Attention by Head View:

Model View:
Neuron-by-neuron view:
You can try out the ChemBERTa attention visualization demos in more detail, with custom SMILES/SELFIES strings, tokenizers, and more in the public library, here.

What is Transfer Learning, and how does ChemBERTa utilize it?

Transfer learning is a research problem in machine learning that focuses on storing knowledge gained while solving one problem and applying it to a different but related problem. By pre-training directly on SMILES strings, and teaching ChemBERTa to recognize masked tokens in each string, the model learns a strong molecular representation. We can then take this model, trained on a structural chemistry task, and apply it to a suite of classification tasks in the MoleculeNet suite, from Tox21 to BBBP!

Fine-tuning ChemBERTa on a Small Molecular Dataset

Our fine-tuning dataset, ClinTox, consists of qualitative data on drugs approved by the FDA and drugs that have failed clinical trials for toxicity reasons. The ClinTox dataset consists of 1478 binary toxicity labels, using SMILES representations to identify molecules. The computational models produced from the dataset could become decision-making tools for government agencies in determining which drugs are of the greatest potential concern to human health. Additionally, these models can act as drug screening tools in drug discovery pipelines for toxicity.

Let's start by importing the MolNet dataloader from bert-loves-chemistry, before importing apex and transformers, the tools which will allow us to import the ChemBERTa language model (LM) trained on PubChem-10M.

%cd /content/bert-loves-chemistry

/content/bert-loves-chemistry

!pwd

/content/bert-loves-chemistry

import os
import numpy as np
import pandas as pd
from typing import List
# import MolNet loaders from deepchem
from deepchem.molnet import load_bbbp, load_clearance, load_clintox, load_delaney, load_hiv, load_qm7, load_tox21
from rdkit import Chem

# import MolNet dataloader from the bert-loves-chemistry fork
from chemberta.utils.molnet_dataloader import load_molnet_dataset, write_molnet_dataset_for_chemprop

But why use a custom SmilesTokenizer over BPE?

In this tutorial, we will be comparing the BPE tokenization algorithm with a custom SmilesTokenizer based on a regex pattern, which we have released as part of DeepChem. To compare tokenizers, we pretrained an identical model on the PubChem-1M set using this novel tokenizer. The pretrained model was evaluated on BBBP and Tox21 in the paper, where we found that the SmilesTokenizer narrowly outperformed the BPE algorithm in ΔPRC-AUC. Though this result suggests that a more semantically relevant tokenization may provide performance benefits, further benchmarking on additional datasets is needed to validate this finding. In this tutorial, we aim to do so by testing the alternate model on the ClinTox dataset.

Let's fetch the SmilesTokenizer's character-per-line vocabulary file, which can be loaded from the DeepChem S3 data bucket:

!wget https://deepchemdata.s3-us-west-1.amazonaws.com/datasets/vocab.txt

--2021-03-18 14:48:45-- https://deepchemdata.s3-us-west-1.amazonaws.com/datasets/vocab.txt
Resolving deepchemdata.s3-us-west-1.amazonaws.com (deepchemdata.s3-us-west-1.amazonaws.com)... 52.219.113.41
Connecting to deepchemdata.s3-us-west-1.amazonaws.com (deepchemdata.s3-us-west-1.amazonaws.com)|52.219.113.41|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 3524 (3.4K) [text/plain]
Saving to: 'vocab.txt'

vocab.txt 100%[===================>] 3.44K --.-KB/s in 0s

2021-03-18 14:48:46 (62.2 MB/s) - 'vocab.txt' saved [3524/3524]
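As a quick hedged illustration (our addition, not required for the rest of the tutorial), DeepChem's SmilesTokenizer can be instantiated directly from this vocabulary file to see the regex-based tokenization in action:

# Illustrative: load the SmilesTokenizer with the vocab file downloaded above.
from deepchem.feat import SmilesTokenizer

smiles_tokenizer = SmilesTokenizer('vocab.txt')
print(smiles_tokenizer.tokenize('CC(C)Cc1ccc(C(C)C(=O)O)cc1'))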
Let's use the MolNet dataloader to generate scaffold splits from the ClinTox dataset.

tasks, (train_df, valid_df, test_df), transformers = load_molnet_dataset("clintox", tasks_wanted=None)

'split' is deprecated. Use 'splitter' instead.
Failed to featurize datapoint 7, None. Appending empty array
Exception message: Python argument types in rdkit.Chem.rdmolfiles.CanonicalRankAtoms(NoneType) did not match C++ signature:
CanonicalRankAtoms(RDKit::ROMol mol, bool breakTies=True, bool includeChirality=True, bool includeIsotopes=True)
Failed to featurize datapoint 302, None. Appending empty array
Failed to featurize datapoint 983, None. Appending empty array
Failed to featurize datapoint 984, None. Appending empty array
Failed to featurize datapoint 1219, None. Appending empty array
Failed to featurize datapoint 1220, None. Appending empty array
/usr/local/lib/python3.7/dist-packages/numpy/core/_asarray.py:83: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray
  return array(a, dtype, copy=False, order=order)
Using tasks ['CT_TOX'] from available tasks for clintox: ['FDA_APPROVED', 'CT_TOX']
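The warnings above come from a handful of unparsable SMILES in the raw data. As an optional, purely illustrative sanity check of ours, you can count how many rows in the returned training dataframe fail to parse with RDKit:

# Optional sanity check: count rows whose SMILES RDKit cannot parse.
n_bad = (~train_df['text'].map(lambda s: Chem.MolFromSmiles(s) is not None)).sum()
print(f'Unparsable training SMILES: {n_bad}')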
If you're only running the toxicity prediction portion of this tutorial, make sure you install transformers here. If you've run all the cells before, you can skip this install, as we've already run pip install transformers.

!pip install transformers

Requirement already satisfied: transformers in /usr/local/lib/python3.7/dist-packages (4.4.1)
Requirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.7/dist-packages (from transformers) (4.59.0)
Requirement already satisfied: importlib-metadata; python_version < "3.8" in /usr/local/lib/python3.7/dist-packages (from transformers) (3.7.2)
Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.7/dist-packages (from transformers) (2019.12.20)
Requirement already satisfied: numpy>=1.17 in /usr/local/lib/python3.7/dist-packages (from transformers) (1.19.5)
Requirement already satisfied: requests in /usr/local/lib/python3.7/dist-packages (from transformers) (2.23.0)
Requirement already satisfied: tokenizers<0.11,>=0.10.1 in /usr/local/lib/python3.7/dist-packages (from transformers) (0.10.1)
Requirement already satisfied: sacremoses in /usr/local/lib/python3.7/dist-packages (from transformers) (0.0.43)
Requirement already satisfied: filelock in /usr/local/lib/python3.7/dist-packages (from transformers) (3.0.12)
Requirement already satisfied: packaging in /usr/local/lib/python3.7/dist-packages (from transformers) (20.9)
Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.7/dist-packages (from importlib-metadata; python_version < "3.8"->transformers) (3.4.1)
Requirement already satisfied: typing-extensions>=3.6.4; python_version < "3.8" in /usr/local/lib/python3.7/dist-packages (from importlib-metadata; python_version < "3.8"->transformers) (3.7.4.3)
Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.7/dist-packages (from requests->transformers) (3.0.4)
Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.7/dist-packages (from requests->transformers) (2.10)
Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.7/dist-packages (from requests->transformers) (1.24.3)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.7/dist-packages (from requests->transformers) (2020.12.5)
Requirement already satisfied: click in /usr/local/lib/python3.7/dist-packages (from sacremoses->transformers) (7.1.2)
Requirement already satisfied: six in /usr/local/lib/python3.7/dist-packages (from sacremoses->transformers) (1.15.0)
Requirement already satisfied: joblib in /usr/local/lib/python3.7/dist-packages (from sacremoses->transformers) (1.0.1)
Requirement already satisfied: pyparsing>=2.0.2 in /usr/local/lib/python3.7/dist-packages (from packaging->transformers) (2.4.7)
train_df

     text                                               labels
0    CC(C)C[C@H](NC(=O)CNC(=O)c1cc(Cl)ccc1Cl)B(O)O      0
1    O=C(NCC(O)CO)c1c(I)c(C(=O)NCC(O)CO)c(I)c(N(CCO...  1
2    Clc1cc(Cl)c(OCC#CI)cc1Cl                           1
3    N#Cc1cc(NC(=O)C(=O)[O-])c(Cl)c(NC(=O)C(=O)[O-])c1 1
4    NS(=O)(=O)c1cc(Cl)c(Cl)c(S(N)(=O)=O)c1             1
...  ...                                                ...
1177 CC(C[NH2+]C1CCCCC1)OC(=O)c1ccccc1                  1
1178 CC(C(=O)[O-])c1ccc(C(=O)c2cccs2)cc1                1
1179 CC(c1cc2ccccc2s1)N(O)C(N)=O                        1
1180 CC(O)C(CO)NC(=O)C1CSSCC(NC(=O)C([NH3+])Cc2cccc...  1
1181 CC(C)OC(=O)CCC/C=C\C[C@H]1[C@@H](O)C[C@@H](O)[...  1

1182 rows × 2 columns

valid_df

     text                                               labels
0    CC(C)OC(=O)CCC/C=C\C[C@H]1[C@@H](O)C[C@@H](O)[...  1
1    CC(C)Nc1cccnc1N1CCN(C(=O)c2cc3cc(NS(C)(=O)=O)c...  1
2    CC(C)n1c(/C=C/[C@H](O)C[C@H](O)CC(=O)[O-])c(-c...  1
3    CC(C)COCC(CN(Cc1ccccc1)c1ccccc1)[NH+]1CCCC1        1
4    CSCC[C@H](NC(=O)[C@H](Cc1c[nH]c2ccccc12)NC(=O)... 1
...  ...                                                ...
143  C[C@H](OC(=O)c1ccccc1)C1=CCC23OCC[NH+](C)CC12C... 1
144  C[C@@H](c1ncncc1F)[C@](O)(Cn1cncn1)c1ccc(F)cc1F   1
145  CC(C)C[C@@H](NC(=O)[C@H](C)NC(=O)CNC(=O)[C@@H]... 1
146  C[C@H](O)[C@H](O)[C@H]1CNc2[nH]c(N)nc(=O)c2N1     1
147  C[NH+]1C[C@H](C(=O)N[C@]2(C)O[C@@]3(O)[C@@H]4C... 1

148 rows × 2 columns

test_df

     text                                               labels
0    C[NH+]1C[C@H](C(=O)N[C@]2(C)O[C@@]3(O)[C@@H]4C... 1
1    C[C@]1(Cn2ccnn2)[C@H](C(=O)[O-])N2C(=O)C[C@H]2... 1
2    C[NH+]1CCC[C@@H]1CCO[C@](C)(c1ccccc1)c1ccc(Cl)cc1 1
3    Nc1nc(NC2CC2)c2ncn([C@H]3C=C[C@@H](CO)C3)c2n1     1
4    OC[C@H]1O[C@@H](n2cnc3c2NC=[NH+]C[C@H]3O)C[C@@... 1
...  ...                                                ...
143  O=C1O[C@H]([C@@H](O)CO)C([O-])=C1O                 1
144  C#CCC(Cc1cnc2nc(N)nc(N)c2n1)c1ccc(C(=O)N[C@@H]... 1
145  C#CC[NH2+][C@@H]1CCc2ccccc21                       1
146  [H]/[NH+]=C(\N)c1ccc(OCCCCCOc2ccc(/C(N)=[NH+]/... 1
147  [H]/[NH+]=C(\N)C1=CC(=O)/C(=C\C=c2ccc(=C(N)[NH... 1

148 rows × 2 columns
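Since we'll rely on automatic weight balancing below, it's worth a quick look at how imbalanced the labels actually are (a one-line check of ours on the dataframe above):

# Inspect the class balance of the ClinTox training labels.
print(train_df['labels'].value_counts())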
From here, let's set up a logger to record any issues that occur, and to notify us if there are problems with the arguments we've set for the model.

from simpletransformers.classification import ClassificationModel
import logging

logging.basicConfig(level=logging.INFO)
transformers_logger = logging.getLogger("transformers")
transformers_logger.setLevel(logging.WARNING)

Now, using simple-transformers, let's load the pre-trained model from Hugging Face's useful model hub. We'll set the number of epochs to 10 in the arguments, but you can train for longer, and pass early stopping as an argument to prevent overfitting. Also make sure that auto_weights is set to True to do automatic weight balancing, as we are dealing with imbalanced toxicity datasets.

from simpletransformers.classification import ClassificationModel, ClassificationArgs

# NOTE: the argument dict was truncated in the source; 'num_train_epochs' and
# 'auto_weights' are restored here from the description in the text above.
model = ClassificationModel('roberta', 'seyonec/PubChem10M_SMILES_BPE_396_250',
                            args={'evaluate_each_epoch': True, 'num_train_epochs': 10, 'auto_weights': True})

Downloading: 0%| | 0.00/515 [00:00<?, ?B/s]
Downloading: 0%| | 0.00/336M [00:00<?, ?B/s]
Downloading: 0%| | 0.00/165k [00:00<?, ?B/s]
Downloading: 0%| | 0.00/101k [00:00<?, ?B/s]
Downloading: 0%| | 0.00/772 [00:00<?, ?B/s]
Downloading: 0%| | 0.00/62.0 [00:00<?, ?B/s]
Some weights of the model checkpoint at seyonec/PubChem10M_SMILES_BPE_396_250 were not used when initializing RobertaForSequenceClassification: ['lm_head.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'lm_head.decoder.weight', 'lm_head.decoder.bias']
- This IS expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Some weights of RobertaForSequenceClassification were not initialized from the model checkpoint at seyonec/PubChem10M_SMILES_BPE_396_250 and are newly initialized: ['classifier.dense.weight', 'classifier.dense.bias', 'classifier.out_proj.weight', 'classifier.out_proj.bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
print(model.tokenizer)

PreTrainedTokenizer(name_or_path='seyonec/PubChem10M_SMILES_BPE_396_250', vocab_size=7924, model_max_len=1000000000000000019884624838656, is_fast=False, padding_side='right', special_tokens={'bos_token': AddedToken("<s>", rstrip=False, lstrip=False, single_word=False, normalized=True), 'eos_token': AddedToken("</s>", rstrip=False, lstrip=False, single_word=False, normalized=True), 'unk_token': AddedToken("<unk>", rstrip=False, lstrip=False, single_word=False, normalized=True), 'sep_token': AddedToken("</s>", rstrip=False, lstrip=False, single_word=False, normalized=True), 'pad_token': AddedToken("<pad>", rstrip=False, lstrip=False, single_word=False, normalized=True), 'cls_token': AddedToken("<s>", rstrip=False, lstrip=False, single_word=False, normalized=True), 'mask_token': AddedToken("<mask>", rstrip=False, lstrip=True, single_word=False, normalized=True)})

# Check that our train and evaluation dataframes are set up properly. There should
# only be two columns: the SMILES string and its corresponding label.
print("Train Dataset: {}".format(train_df.shape))
print("Eval Dataset: {}".format(valid_df.shape))
print("TEST Dataset: {}".format(test_df.shape))

Train Dataset: (1182, 2)
Eval Dataset: (148, 2)
TEST Dataset: (148, 2)

Now that we've set everything up, let's get to the fun part: training the model! We use Weights and Biases, which is optional (simply remove wandb_project from the list of args). It's a really useful tool for monitoring the model's training results (such as accuracy, learning rate, and loss), alongside custom visualizations of attention and gradients.
Now that we've set everything up, let's get to the fun part: training the model! We use Weights and Biases, which is optional (simply remove wandb_project from the list of args). It's a really useful tool for monitoring the model's training results (such as accuracy, learning rate and loss), alongside custom visualizations of attention and gradients. When you run this cell, Weights and Biases will ask for an account, which you can set up through a GitHub account, giving you an authorization API key which you can paste into the output of the cell. Again, this is completely optional and it can be removed from the list of arguments.

! wandb login

wandb: You can find your API key in your browser here: https://wandb.ai/authorize
wandb: Paste an API key from your profile and hit enter:
wandb: Appending key for api.wandb.ai to your netrc file: /root/.netrc

Finally, the moment we've been waiting for! Let's train the model on the train scaffold set of ClinTox, and monitor our runs using W&B. We will evaluate the performance of our model each epoch using the validation set.

# Create directory to store model weights (change the path to wherever you want!)
! mkdir BPE_PubChem_10M_ClinTox_run

# Train the model
model.train_model(train_df, eval_df=valid_df,
                  output_dir='/content/BPE_PubChem_10M_ClinTox_run',
                  args={'wandb_project': 'project-name'})

INFO:simpletransformers.classification.classification_utils: Converting to features started. Cache is not used.
INFO:simpletransformers.classification.classification_utils: Saving features into cached file cache_dir/cached_train_roberta_128_2_1182
INFO:simpletransformers.classification.classification_model: Initializing WandB run for training.
wandb: Currently logged in as: seyonec (use `wandb login --relogin` to force relogin)
Tracking run with wandb version 0.10.22
Syncing run snowy-firefly-250 to Weights & Biases. Run page: https://wandb.ai/seyonec/project-name/runs/1t7dyfs4
Running Epoch 0 of 10: 0%| | 0/148 [00:00<?, ?it/s] ... Running Epoch 9 of 10: 0%| | 0/148 [00:00<?, ?it/s]
/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py:760: UserWarning: Using non-full backward hooks on a Module that does not return a single Tensor or a tuple of Tensors is deprecated and will be removed in future versions. This hook will be missing some of the grad_output. Please use register_full_backward_hook to get the documented behavior.
/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py:795: UserWarning: Using a non-full backward hook when the forward contains multiple autograd Nodes is deprecated and will be removed in future versions. This hook will be missing some grad_input. Please use register_full_backward_hook to get the documented behavior.
INFO:simpletransformers.classification.classification_model: Training of roberta model complete. Saved to /content/BPE_PubChem_10M_ClinTox_run.

(1480, 0.10153530814545406)

Let's install scikit-learn now, to evaluate the model we've trained. We will be using accuracy and PRC-AUC (average precision score) as metrics.

import sklearn

# accuracy
result, model_outputs, wrong_predictions = model.eval_model(test_df, acc=sklearn.metrics.accuracy_score)

# PRC-AUC
result, model_outputs, wrong_predictions = model.eval_model(test_df, acc=sklearn.metrics.average_precision_score)

INFO:simpletransformers.classification.classification_utils: Converting to features started. Cache is not used.
INFO:simpletransformers.classification.classification_utils: Saving features into cached file cache_dir/cached_dev_roberta_128_2_148
Running Evaluation: 0%| | 0/19 [00:00<?, ?it/s]
INFO:simpletransformers.classification.classification_model: Initializing WandB run for evaluation.
Finishing last run (ID:1t7dyfs4) before initializing another...
Waiting for W&B process to finish, PID 4627
Program ended successfully.

Run summary: Training loss 0.0003, lr 0.0, global_step 1450, _runtime 116, _timestamp 1616079332, _step 28
Run history: per-step charts for Training loss, lr, and global_step (viewable on the W&B run page).
Synced 5 W&B file(s), 1 media file(s), 0 artifact file(s) and 0 other file(s)
Synced snowy-firefly-250: https://wandb.ai/seyonec/project-name/runs/1t7dyfs4
Successfully finished last run (ID:1t7dyfs4). Initializing new run:
Syncing run summer-pyramid-251 to Weights & Biases. Run page: https://wandb.ai/seyonec/project-name/runs/a6brkv9i

INFO:simpletransformers.classification.classification_model:{'mcc': 0.664470436990577, 'tp': 138, 'tn': 5, 'fp': 4, 'fn': 1, 'auroc': 0.8281374900079936, 'auprc': 0.9855371861072479, 'acc': 0.9662162162162162, 'eval_loss': 0.2469284737426757}
INFO:simpletransformers.classification.classification_utils: Converting to features started. Cache is not used.
Running Evaluation: 0%| | 0/19 [00:00<?, ?it/s]
Synced summer-pyramid-251: https://wandb.ai/seyonec/project-name/runs/a6brkv9i
Syncing run vivid-morning-252 to Weights & Biases. Run page: https://wandb.ai/seyonec/project-name/runs/7bl6wyef
INFO:simpletransformers.classification.classification_model:{'mcc': 0.664470436990577, 'tp': 138, 'tn': 5, 'fp': 4, 'fn': 1, 'auroc': 0.8281374900079936, 'auprc': 0.9855371861072479, 'acc': 0.9715961528455196, 'eval_loss': 0.2469284737426757}

The model performs pretty well, reaching above 97% accuracy and a PRC-AUC of 0.986 after training on only ~1400 data samples and 150 positive leads in a couple of minutes! We can clearly see the predictive power of transfer learning, and approaches like these are becoming increasingly popular in the pharmaceutical industry, where large datasets are scarce. By training on more epochs and tasks, we can probably boost the accuracy as well!

Let's evaluate the model on one last string from ClinTox's test set for toxicity. The model should predict 1, meaning the drug failed clinical trials for toxicity reasons and wasn't approved by the FDA.

# Let's input a molecule with a toxicity value of 1
predictions, raw_outputs = model.predict(['C1=C(C(=O)NC(=O)N1)F'])

INFO:simpletransformers.classification.classification_utils: Converting to features started. Cache is not used.

print(predictions)
print(raw_outputs)

[1]
[[-4.51171875 4.58203125]]

The model predicts the sample correctly! Some future tasks may include using the same model on multiple tasks (Tox21 provides multiple tasks relating to different biochemical pathways for toxicity, as an example) through multi-task classification, as well as training on a larger dataset such as HIV, one of the harder tasks in molecular machine learning. This will be expanded on in future work!
Benchmarking Smiles-Tokenizer ChemBERTa models on ClinTox

Now let's compare how this model performs to a similar variant of ChemBERTa that utilizes a different tokenizer: the SmilesTokenizer, which is built into DeepChem! Let's see if a tokenizer which splits SMILES sequences into syntactically relevant chemical tokens performs differently, especially on molecular property prediction. First off, let's initialize this variant model:

from simpletransformers.classification import ClassificationModel, ClassificationArgs

model = ClassificationModel('roberta', 'seyonec/SMILES_tokenized_PubChem_shard00_160k',
                            args={'evaluate_each_epoch': True,
                                  'num_train_epochs': 15,
                                  'auto_weights': True})

Downloading: model configuration, weights, and tokenizer files to /root/.cache/huggingface/transformers.

Some weights of the model checkpoint at seyonec/SMILES_tokenized_PubChem_shard00_160k were not used when initializing RobertaForSequenceClassification: ['lm_head.bias', 'lm_head.dense.weight', 'lm_head.dense.bias', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'lm_head.decoder.weight', 'lm_head.decoder.bias']
- This IS expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Some weights of RobertaForSequenceClassification were not initialized from the model checkpoint at seyonec/SMILES_tokenized_PubChem_shard00_160k and are newly initialized: ['classifier.dense.weight', 'classifier.dense.bias', 'classifier.out_proj.weight', 'classifier.out_proj.bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.

print(model.tokenizer)

PreTrainedTokenizer(name_or_path='seyonec/SMILES_tokenized_PubChem_shard00_160k', vocab_size=591, model_max_len=514, is_fast=False, padding_side='right', special_tokens={'bos_token': AddedToken("<s>", rstrip=False, lstrip=False, single_word=False, normalized=True), 'eos_token': AddedToken("</s>", rstrip=False, lstrip=False, single_word=False, normalized=True), 'unk_token': '[UNK]', 'sep_token': '[SEP]', 'pad_token': '[PAD]', 'cls_token': '[CLS]', 'mask_token': '[MASK]'})

# Check that our train and evaluation dataframes are set up properly. There should only be
# two columns: the SMILES string and its corresponding label.
print("Train Dataset: {}".format(train_df.shape))
print("Eval Dataset: {}".format(valid_df.shape))
print("TEST Dataset: {}".format(test_df.shape))

Train Dataset: (1182, 2)
Eval Dataset: (148, 2)
TEST Dataset: (148, 2)

As before, we train with Weights and Biases logging enabled; this is optional, and the setup is identical to the first model.

! wandb login

wandb: Currently logged in as: seyonec (use `wandb login --relogin` to force relogin)

# Create directory to store model weights (change the path to wherever you want!)
! mkdir SmilesTokenizer_PubChem_10M_ClinTox_run
# Train the model
model.train_model(train_df, eval_df=valid_df,
                  output_dir='/content/SmilesTokenizer_PubChem_10M_ClinTox_run',
                  args={'wandb_project': 'project-name'})

INFO:simpletransformers.classification.classification_utils: Converting to features started. Cache is not used.
INFO:simpletransformers.classification.classification_utils: Saving features into cached file cache_dir/cached_train_roberta_128_2_1182
INFO:simpletransformers.classification.classification_model: Initializing WandB run for training.
Synced vivid-morning-252: https://wandb.ai/seyonec/project-name/runs/7bl6wyef
Syncing run revived-armadillo-253 to Weights & Biases. Run page: https://wandb.ai/seyonec/project-name/runs/v04qi4gi
Running Epoch 0 of 15: 0%| | 0/148 [00:00<?, ?it/s] ... Running Epoch 14 of 15: 0%| | 0/148 [00:00<?, ?it/s]
INFO:simpletransformers.classification.classification_model: Training of roberta model complete. Saved to /content/SmilesTokenizer_PubChem_10M_ClinTox_run.
(2220, 0.09892498987772685)

Let's use scikit-learn again to evaluate the model we've trained, with the same accuracy and PRC-AUC (average precision score) metrics.

import sklearn

# accuracy
result, model_outputs, wrong_predictions = model.eval_model(test_df, acc=sklearn.metrics.accuracy_score)

# PRC-AUC
result, model_outputs, wrong_predictions = model.eval_model(test_df, acc=sklearn.metrics.average_precision_score)

INFO:simpletransformers.classification.classification_utils: Converting to features started. Cache is not used.
INFO:simpletransformers.classification.classification_utils: Saving features into cached file cache_dir/cached_dev_roberta_128_2_148
Running Evaluation: 0%| | 0/19 [00:00<?, ?it/s]

Run summary: Training loss 0.11875, lr 0.0, global_step 2200, _runtime 175, _timestamp 1616079546, _step 43
Run history: per-step charts for Training loss, lr, and global_step (viewable on the W&B run page).
Synced revived-armadillo-253: https://wandb.ai/seyonec/project-name/runs/v04qi4gi
Syncing run pleasant-wave-254 to Weights & Biases. Run page: https://wandb.ai/seyonec/project-name/runs/3ti3lfl8

INFO:simpletransformers.classification.classification_model:{'mcc': 0.3646523331752495, 'tp': 138, 'tn': 2, 'fp': 7, 'fn': 1, 'auroc': 0.8073541167066347, 'auprc': 0.984400271563181, 'acc': 0.9459459459459459, 'eval_loss': 0.3173560830033047}
INFO:simpletransformers.classification.classification_utils: Converting to features started. Cache is not used.
Running Evaluation: 0%| | 0/19 [00:00<?, ?it/s]
Synced pleasant-wave-254: https://wandb.ai/seyonec/project-name/runs/3ti3lfl8
Syncing run dulcet-shadow-255 to Weights & Biases. Run page: https://wandb.ai/seyonec/project-name/runs/17769dhr
INFO:simpletransformers.classification.classification_model:{'mcc': 0.3646523331752495, 'tp': 138, 'tn': 2, 'fp': 7, 'fn': 1, 'auroc': 0.8073541167066347, 'auprc': 0.984400271563181, 'acc': 0.951633958443683, 'eval_loss': 0.3173560830033047}

The model performs well here too, reaching a PRC-AUC of 0.984 and ~95% accuracy after training on only ~1400 data samples and 150 positive leads in a couple of minutes! This model was also pre-trained on 1/10th the amount of data used for the PubChem-10M BPE model we used previously, but it still shows robust performance. We can clearly see the predictive power of transfer learning, and approaches like these are becoming increasingly popular in the pharmaceutical industry, where large datasets are scarce. By training on more epochs and tasks, we can probably boost the accuracy as well!

Let's evaluate the model on one last string from ClinTox's test set for toxicity. The model should predict 1, meaning the drug failed clinical trials for toxicity reasons and wasn't approved by the FDA.

# Let's input a molecule with a toxicity value of 1
predictions, raw_outputs = model.predict(['C1=C(C(=O)NC(=O)N1)F'])

INFO:simpletransformers.classification.classification_utils: Converting to features started. Cache is not used.

print(predictions)
print(raw_outputs)

[1]
[[-4.546875 4.83984375]]

The model predicts the sample correctly! Some future tasks may include using the same model on multiple tasks (Tox21 provides multiple tasks relating to different biochemical pathways for toxicity, as an example) through multi-task classification, as well as training on a larger dataset such as HIV, one of the other harder tasks in molecular machine
learning. This will be expanded on in future work!

Congratulations! Time to join the Community!

Congratulations on completing this tutorial notebook! If you enjoyed working through the tutorial, and want to continue working with DeepChem, we encourage you to finish the rest of the tutorials in this series. You can also help the DeepChem community in the following ways:

Star DeepChem on GitHub
This helps build awareness of the DeepChem project and the tools for open source drug discovery that we're trying to build.

Join the DeepChem Gitter
The DeepChem Gitter hosts a number of scientists, developers, and enthusiasts interested in deep learning for the life sciences. Join the conversation!
Training a Normalizing Flow on QM9

By Nathan C. Frey | Twitter

In this tutorial, we will train a Normalizing Flow (NF) on the QM9 dataset. The dataset comprises 133,885 stable small organic molecules made up of CHNOF atoms. We will try to train a network that is an invertible transformation between a simple base distribution and the distribution of molecules in QM9.

One of the key advantages of normalizing flows is that they can be constructed to efficiently sample from a distribution (generative modeling) and do probability density calculations (exactly compute log-likelihoods), whereas other models make tradeoffs between the two or can only approximate probability densities. This work has been published as FastFlows; see the reference under Further reading at the end of this tutorial.

NFs are useful whenever we need a probabilistic model with one or both of these capabilities. Note that because NFs are completely invertible, there is no "latent space" in the sense used when referring to generative adversarial networks or variational autoencoders. For more on NFs, we refer to this review paper.

To encode the QM9 dataset, we'll make use of the SELFIES (SELF-referencIng Embedded Strings) representation, which is a 100% robust molecular string representation. SMILES strings produced by generative models are often syntactically invalid (they do not correspond to a molecular graph), or they violate chemical rules like the maximum number of bonds between atoms. SELFIES are designed so that even totally random SELFIES strings correspond to valid molecular graphs, so they are a great framework for generative modeling. For more details about SELFIES, see the GitHub repo and the associated paper.

Colab

This tutorial and the rest in this sequence are designed to be done in Google Colab. If you'd like to open this notebook in Colab, you can use the following link.

Open in Colab

Setup

To run DeepChem within Colab, you'll need to run the following cell of installation commands. This will take about 5 minutes to run to completion and install your environment.

! pip install --pre deepchem
import deepchem
deepchem.__version__

Requirement already satisfied: deepchem in /usr/local/lib/python3.7/dist-packages (2.6.1)
Requirement already satisfied: pandas in /usr/local/lib/python3.7/dist-packages (from deepchem) (1.3.5)
Requirement already satisfied: scikit-learn in /usr/local/lib/python3.7/dist-packages (from deepchem) (1.0.2)
Requirement already satisfied: numpy>=1.21 in /usr/local/lib/python3.7/dist-packages (from deepchem) (1.21.5)
Requirement already satisfied: scipy in /usr/local/lib/python3.7/dist-packages (from deepchem) (1.4.1)
Requirement already satisfied: joblib in /usr/local/lib/python3.7/dist-packages (from deepchem) (1.1.0)
Requirement already satisfied: rdkit-pypi in /usr/local/lib/python3.7/dist-packages (from deepchem) (2021.9.5.1)
Requirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.7/dist-packages (from pandas->deepchem) (2018.9)
Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.7/dist-packages (from pandas->deepchem) (2.8.2)
Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.7/dist-packages (from python-dateutil>=2.7.3->pandas->deepchem) (1.15.0)
Requirement already satisfied: Pillow in /usr/local/lib/python3.7/dist-packages (from rdkit-pypi->deepchem) (7.1.2)
Requirement already satisfied: threadpoolctl>=2.0.0 in /usr/local/lib/python3.7/dist-packages (from scikit-learn->deepchem) (3.1.0)

'2.6.1'

Install the SELFIES library to translate SMILES strings.

! pip install selfies

Requirement already satisfied: selfies in /usr/local/lib/python3.7/dist-packages (2.0.0)

import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
import pandas as pd
import os

import deepchem as dc
from deepchem.models.normalizing_flows import NormalizingFlow, NormalizingFlowModel
from deepchem.models.optimizers import Adam
from deepchem.data import NumpyDataset
from deepchem.splits import RandomSplitter
from deepchem.molnet import load_tox21

import rdkit
from rdkit import Chem
from rdkit.Chem.Draw import IPythonConsole
from rdkit.Chem import Draw
from IPython.display import Image, display

import selfies as sf

import tensorflow as tf
import tensorflow_probability as tfp
tfd = tfp.distributions
tfb = tfp.bijectors
tfk = tf.keras
tfk.backend.set_floatx('float64')

First, let's get a dataset of 2500 small organic molecules from the QM9 dataset. We'll then convert the molecules to SELFIES, one-hot encode them, and dequantize the inputs so they can be processed by a normalizing flow. 2000 molecules will be used for training, while the remaining 500 will be split into validation and test sets. We'll use the validation set to see how our architecture is doing at learning the underlying distribution, and leave the test set alone. You should feel free to experiment with this notebook to get the best model you can and evaluate it on the test set when you're done!

# Download from MolNet
tasks, datasets, transformers = dc.molnet.load_qm9(featurizer='ECFP')
df = pd.DataFrame(data={'smiles': datasets[0].ids})
data = df[['smiles']].sample(2500, random_state=42)

SELFIES defines a dictionary of bond constraints that enforces how many bonds every atom or ion can make, e.g. 'C': 4, 'H': 1, and so on. The ? symbol is used for any atom or ion that isn't defined in the dictionary, and it defaults to 8 bonds. Because QM9 contains ions and we don't want to allow those ions to form up to 8 bonds, we'll constrain them to 3. This will really improve the percentage of valid molecules we generate. You can read more about setting constraints in the SELFIES documentation.

sf.set_semantic_constraints()  # reset constraints
constraints = sf.get_semantic_constraints()
constraints['?'] = 3
sf.set_semantic_constraints(constraints)
constraints

{'?': 3, 'B': 3, 'B+1': 2, 'B-1': 4, 'Br': 1, 'C': 4, 'C+1': 5, 'C-1': 3, 'Cl': 1, 'F': 1, 'H': 1, 'I': 1, 'N': 3, 'N+1': 4, 'N-1': 2, 'O': 2, 'O+1': 3, 'O-1': 1, 'P': 5, 'P+1': 6, 'P-1': 4, 'S': 6, 'S+1': 7, 'S-1': 5}
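Before encoding the whole dataset, it can help to sanity-check the SMILES-to-SELFIES translation on a single molecule. A minimal round-trip sketch (the benzene SMILES below is just an illustrative input, not drawn from the QM9 sample):

import selfies as sf

smiles = 'c1ccccc1'  # benzene, an example input
selfies_str = sf.encoder(smiles)  # SMILES -> SELFIES
recovered = sf.decoder(selfies_str)  # SELFIES -> SMILES
print(selfies_str)  # a token string such as [C][=C][C][=C][C][=C][Ring1][=Branch1]
print(recovered)    # a (possibly kekulized) SMILES for the same molecule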
def preprocess_smiles(smiles):
    return sf.encoder(smiles)

def keys_int(symbol_to_int):
    d = {}
    i = 0
    for key in symbol_to_int.keys():
        d[i] = key
        i += 1
    return d

data['selfies'] = data['smiles'].apply(preprocess_smiles)

Let's take a look at some short SMILES strings and their corresponding SELFIES representations. We can see right away that there is a key difference in how the two representations deal with rings and branches. SELFIES is designed so that branch length and ring size are stored locally with the Branch and Ring identifiers, and the SELFIES grammar prevents invalid strings.

data['len'] = data['smiles'].apply(lambda x: len(x))
data.sort_values(by='len').head()

       smiles                 selfies                                             len
6728   [H]c1nnc([H])o1        [H][C][=N][N][=C][Branch1][C][H][O][Ring1][=Br...   15
72803  [H]c1nnnc([H])n1       [H][C][=N][N][=N][C][Branch1][C][H][=N][Ring1]...   16
97670  [H]c1onnc1C(F)(F)F     [H][C][O][N][=N][C][=Ring1][Branch1][C][Branch...   18
25487  [H]n1nnc(C#CC#N)n1     [H][N][N][=N][C][Branch1][Branch1][C][#C][C][#...   18
32004  [H]C#Cc1nnc(F)nc1[H]   [H][C][#C][C][=N][N][=C][Branch1][C][F][N][=C]...   20

To convert SELFIES to a one-hot encoded representation, we need to construct an alphabet of all the characters that occur in the list of SELFIES strings. We also have to know what the longest SELFIES string is, so that all the shorter SELFIES can be padded with '[nop]' to be equal length.

selfies_list = np.asanyarray(data.selfies)
selfies_alphabet = sf.get_alphabet_from_selfies(selfies_list)
selfies_alphabet.add('[nop]')  # Add the "no operation" symbol as a padding character
selfies_alphabet.add('.')
selfies_alphabet = list(sorted(selfies_alphabet))
largest_selfie_len = max(sf.len_selfies(s) for s in selfies_list)
symbol_to_int = dict((c, i) for i, c in enumerate(selfies_alphabet))
int_mol = keys_int(symbol_to_int)

selfies has a handy utility function to translate SELFIES strings into one-hot encoded vectors.

onehots = sf.batch_selfies_to_flat_hot(selfies_list, symbol_to_int, largest_selfie_len)

Next, we "dequantize" the inputs by adding random noise from the interval [0, 1) to every input in the encodings. This allows the normalizing flow to operate on continuous inputs (rather than discrete), and the original inputs can easily be recovered by applying a floor function.

input_tensor = tf.convert_to_tensor(onehots, dtype='float64')
noise_tensor = tf.random.uniform(shape=input_tensor.shape, minval=0, maxval=1, dtype='float64')
dequantized_data = tf.add(input_tensor, noise_tensor)

The dequantized data is ready to be processed as a DeepChem dataset and split into training, validation, and test sets. We'll also keep track of the SMILES strings for the training set so we can compare the training data to our generated molecules later on.

ds = NumpyDataset(dequantized_data)  # Create a DeepChem dataset
splitter = RandomSplitter()
train, val, test = splitter.train_valid_test_split(dataset=ds, seed=42)
train_idx, val_idx, test_idx = splitter.split(dataset=ds, seed=42)
dim = len(train.X[0])  # length of one-hot encoded vectors

train.X.shape  # 2000 samples, N-dimensional one-hot vectors that represent molecules

(2000, 2596)

# SMILES strings of training data
train_smiles = data['smiles'].iloc[train_idx].values
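Before moving on to the model, note that because the added noise lives in [0, 1) and the one-hot entries are integers, applying a floor function to the dequantized tensor recovers the original encodings exactly. A quick sanity check (this assertion is our own addition, not a cell from the original notebook):

recovered = tf.math.floor(dequantized_data)  # undo the dequantization
assert np.array_equal(recovered.numpy(), np.asarray(onehots, dtype='float64'))  # matches the one-hots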
Next we'll set up the normalizing flow model. The base distribution is a multivariate normal distribution. The permutation layer permutes the dimensions of the input so that the normalizing flow layers will operate along multiple dimensions of the inputs. To understand why the permutation is needed, we need to know a bit about how the normalizing flow architecture works.

base_dist = tfd.MultivariateNormalDiag(loc=np.zeros(dim), scale_diag=np.ones(dim))
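The base distribution already exposes the two operations we care about, efficient sampling and exact log-density, and the flow layers will preserve both. A small sketch of the TensorFlow Probability API (shapes assume the dim defined above):

x = base_dist.sample(5)     # 5 draws from the base distribution, shape (5, dim)
lp = base_dist.log_prob(x)  # exact log-density of each draw, shape (5,)
print(x.shape, lp.shape)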
if dim % 2 == 0:
    permutation = tf.cast(np.concatenate((np.arange(dim / 2, dim), np.arange(0, dim / 2))), tf.int32)
else:
    permutation = tf.cast(np.concatenate((np.arange(dim / 2 + 1, dim), np.arange(0, dim / 2))), tf.int32)

For this simple example, we'll set up a flow of repeating Masked Autoregressive Flow (MAF) layers. The autoregressive property is enforced by using the Masked Autoencoder for Distribution Estimation (MADE) architecture. The layers of the flow are bijectors: invertible mappings between the base and target distributions. MAF takes the inputs from the base distribution and transforms them with a simple scale-and-shift (affine) operation, but crucially the scale and shift for each dimension of the output depend on the previously generated dimensions of the output: y_i = x_i * exp(s_i) + t_i, where the scale s_i and shift t_i are functions only of y_1, ..., y_{i-1}. This independence from future dimensions preserves the autoregressive property and ensures that the normalizing flow is invertible. Now we can see why we need permutations to change the ordering of the inputs: without them, the normalizing flow would only transform certain dimensions of the inputs.

BatchNormalization layers can be added for additional stability in training, but may have strange effects on the outputs and require some input reshaping to work properly. Increasing num_layers and hidden_units can make more expressive flows capable of modeling more complex target distributions.

num_layers = 8
flow_layers = []

Made = tfb.AutoregressiveNetwork(params=2, hidden_units=[512, 512], activation='relu')

for i in range(num_layers):
    flow_layers.append(tfb.MaskedAutoregressiveFlow(shift_and_log_scale_fn=Made))
    permutation = tf.cast(np.random.permutation(np.arange(0, dim)), tf.int32)
    flow_layers.append(tfb.Permute(permutation=permutation))
    # if (i + 1) % int(2) == 0:
    #     flow_layers.append(tfb.BatchNormalization())

We can draw samples from the untrained distribution, but for now they don't have any relation to the QM9 dataset distribution.

%%time
nf = NormalizingFlow(base_distribution=base_dist, flow_layers=flow_layers)

CPU times: user 280 ms, sys: 10.2 ms, total: 290 ms
Wall time: 289 ms

A NormalizingFlowModel takes a NormalizingFlow and any parameters used by deepchem.models.KerasModel.

nfm = NormalizingFlowModel(nf, learning_rate=1e-4, batch_size=128)

Now to train the model! We'll try to minimize the negative log likelihood loss, which measures the likelihood that generated samples are drawn from the target distribution; i.e., as we train the model, it should get better at modeling the target distribution, and it will generate samples that look like molecules from the QM9 dataset.

losses = []
val_losses = []

%%time
max_epochs = 10  # maximum number of epochs of the training

for epoch in range(max_epochs):
    loss = nfm.fit(train, nb_epoch=1, all_losses=losses)
    val_loss = nfm.create_nll(val.X)
    val_losses.append(val_loss.numpy())

WARNING:tensorflow:Model was constructed with shape (None, 2596) for input KerasTensor(type_spec=TensorSpec(shape=(None, 2596), dtype=tf.float64, name='input_1'), name='input_1', description="created by layer 'input_1'"), but it was called on an input with incompatible shape (1, 128, 2596). (This warning repeats for each training batch.)
CPU times: user 13min 40s, sys: 20.9 s, total: 14min 1s
Wall time: 7min 27s

f, ax = plt.subplots()
ax.scatter(range(len(losses)), losses, label='train loss')
ax.scatter(range(len(val_losses)), val_losses, label='val loss')
plt.legend(loc='upper right');

The normalizing flow is learning a mapping between the multivariate Gaussian and the target distribution! We can see this by visualizing the loss on the validation set. We can now use nfm.flow.sample() to generate new QM9-like molecules and nfm.flow.log_prob() to evaluate the likelihood that a molecule was drawn from the underlying distribution.

generated_samples = nfm.flow.sample(10)  # generative modeling
log_probs = nfm.flow.log_prob(generated_samples)  # probability density estimation

Now we transform the generated samples back into SELFIES. We have to quantize the outputs and add padding characters to any one-hot encoding vector that has all zeros.

mols = tf.math.floor(generated_samples)  # quantize data
mols = tf.clip_by_value(mols, 0, 1)  # Set negative values to 0 and values > 1 to 1
mols_list = mols.numpy().tolist()

# Add padding characters if needed
for mol in mols_list:
    for i in range(largest_selfie_len):
        row = mol[len(selfies_alphabet) * i : len(selfies_alphabet) * (i + 1)]
        if all(elem == 0 for elem in row):
            mol[len(selfies_alphabet) * (i + 1) - 1] = 1

selfies has another utility function to translate one-hot encoded representations back to SELFIES strings.

mols = sf.batch_flat_hot_to_selfies(mols_list, int_mol)

We can use RDKit to find valid generated molecules; some have unphysical valencies and should be discarded. If you've ever tried to generate valid SMILES strings, you'll notice right away that this model is doing much better than we would expect! Using SELFIES, nearly all of the generated molecules are valid (100% in this run), even though our normalizing flow architecture doesn't know any rules that govern chemical validity.

from rdkit import RDLogger
from rdkit import Chem

RDLogger.DisableLog('rdApp.*')  # suppress error messages

valid_count = 0
valid_selfies, invalid_selfies = [], []
for idx, selfies in enumerate(mols):
    try:
        if Chem.MolFromSmiles(sf.decoder(mols[idx]), sanitize=True) is not None:
            valid_count += 1
            valid_selfies.append(selfies)
        else:
            invalid_selfies.append(selfies)
    except Exception:
        pass
print('%.2f' % (valid_count / len(mols)), '% of generated samples are valid molecules.')

1.00 % of generated samples are valid molecules.
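Because the flow gives exact log-likelihoods, one natural use of log_probs is ranking the generated samples by how probable the model thinks they are, for example to decide which candidates to inspect first. A small sketch (the ranking variable is our own, not part of the original notebook):

# Pair each generated SELFIES string with its log-likelihood and sort, most probable first
ranking = sorted(zip(mols, log_probs.numpy()), key=lambda pair: -pair[1])
for selfies_str, lp in ranking[:3]:
    print(round(float(lp), 2), selfies_str[:60])  # truncate long SELFIES for readability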
Let's take a look at some of the generated molecules! We'll borrow some helper functions from the Modeling Solubility tutorial to display molecules with RDKit.

gen_mols = [Chem.MolFromSmiles(sf.decoder(vs)) for vs in valid_selfies]

def display_images(filenames):
    """Helper to pretty-print images."""
    for file in filenames:
        display(Image(file))

def mols_to_pngs(mols, basename="generated_mol"):
    """Helper to write RDKit mols to png files."""
    filenames = []
    for i, mol in enumerate(mols):
        filename = "%s%d.png" % (basename, i)
        Draw.MolToFile(mol, filename)
        filenames.append(filename)
    return filenames

display_mols = []
for i in range(10):
    display_mols.append(gen_mols[i])

display_images(mols_to_pngs(display_mols))
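As an aside, RDKit can also render a list of molecules as a single grid image, which is sometimes more convenient than writing individual PNG files. A sketch using the same display_mols (this cell is our addition; MolsToGridImage is a standard RDKit helper):

from rdkit.Chem import Draw

img = Draw.MolsToGridImage(display_mols, molsPerRow=5, subImgSize=(200, 200))
img  # renders inline in a notebook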
Finally, we can compare generated molecules with our training data via a similarity search using Tanimoto similarity. This gives an indication of how "original" the generated samples are, versus simply producing samples that are extremely similar to molecules the model has already seen. We have to keep in mind that QM9 contains all stable small molecules with up to 9 heavy atoms (CONF). So anything new we generate either exists in the full QM9 dataset, or else will not obey the charge neutrality and stability criteria used to generate QM9.

from rdkit.Chem.Fingerprints.FingerprintMols import FingerprintMol
from rdkit.DataStructs import FingerprintSimilarity
from IPython.display import display

def tanimoto_similarity(database_mols, query_mol):
    """Compare generated molecules to database by Tanimoto similarity."""
    # convert Mol to datastructure type
    fps = [FingerprintMol(m) for m in database_mols]

    # set a query molecule to compare against database
    query = FingerprintMol(query_mol)

    similarities = []
    # loop through to find Tanimoto similarity
    for idx, f in enumerate(fps):
        # tuple: (idx, similarity)
        similarities.append((idx, FingerprintSimilarity(query, f)))

    # sort by similarity, highest first
    similarities.sort(key=lambda x: x[1], reverse=True)

    return similarities

We'll consider our generated molecules and look at the top 3 most similar molecules from the training data by Tanimoto similarity. Here's an example where the Tanimoto similarity scores are medium: there are molecules in our training set that are similar to our generated sample. This might be interesting, or it might mean that the generated molecule is unrealistic.

train_mols = [Chem.MolFromSmiles(smiles) for smiles in train_smiles]

# change the second argument to compare different generated molecules to QM9
tanimoto_scores = tanimoto_similarity(train_mols, gen_mols[3])
similar_mols = []
for idx, ts in tanimoto_scores[:3]:
    print(round(ts, 3))
    similar_mols.append(train_mols[idx])
display_images(mols_to_pngs(similar_mols, 'qm9_mol'))

0.521
0.471
0.468
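Tanimoto similarity measures resemblance; a complementary, stricter check is exact novelty: whether a generated molecule's canonical SMILES already appears in the training set. A minimal sketch (novel_smiles is our own variable, not part of the original notebook):

train_canon = {Chem.MolToSmiles(m) for m in train_mols}  # canonical SMILES of the training data
gen_canon = [Chem.MolToSmiles(m) for m in gen_mols]
novel_smiles = [s for s in set(gen_canon) if s not in train_canon]
print(len(novel_smiles), 'of', len(set(gen_canon)), 'unique generated molecules are not in the training set.')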
Molecules of the previous tutorial:
These molecules were obtained through sampling. Comparing with the Tanimoto similarity:
With scores of:

0.243
0.243
0.241

Further reading

So far we have looked at a measure of validity and done a bit of investigation into the novelty of the generated compounds. There are more dimensions along which we can and should evaluate the performance of a generative model. For an example of some standard benchmarks, see the GuacaMol evaluation framework. For more information about FastFlows, look at this paper, where the workflow is clearly explained. For examples of normalizing flow-based molecular graph generation frameworks, check out the MoFlow, GraphAF, and GraphNVP papers.

Congratulations! Time to join the Community!

Congratulations on completing this tutorial notebook! If you enjoyed working through the tutorial, and want to continue working with DeepChem, we encourage you to finish the rest of the tutorials in this series. You can also help the DeepChem community in the following ways:

Star DeepChem on GitHub
This helps build awareness of the DeepChem project and the tools for open source drug discovery that we're trying to build.

Join the DeepChem Gitter
The DeepChem Gitter hosts a number of scientists, developers, and enthusiasts interested in deep learning for the life sciences. Join the conversation!
Screening ZINC For HIV Inhibition

In this tutorial we will walk through how to efficiently screen a large compound library (ZINC) with DeepChem. Screening a large compound library using machine learning is a CPU-bound, pleasingly parallel problem. The code examples here assume the resources available are a single very big machine (like an AWS c5.18xlarge), but should be readily swappable for other systems (like a supercomputing cluster). At a high level, what we will do is...

1. Create a Machine Learning Model Over Labeled Data
2. Transform ZINC into "Work-Units"
3. Create an inference script which runs predictions over a "Work-Unit"
4. Load "Work-Units" into a "Work Queue"
5. Consume work units from the "Work Queue"
6. Gather Results

This tutorial is unlike the previous tutorials in that it's designed to be run on AWS rather than on Google Colab. That's because we'll need access to a large machine with many cores to do this computation efficiently. We'll try to provide details about how to do this throughout the tutorial.

1. Train Model On Labelled Data

We are just going to knock out a simple model here. In a real-world problem you will probably try several models and do a little hyperparameter searching.

from deepchem.molnet.load_function import hiv_datasets
/Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/sklearn/externals/joblib/__init__. py:15: Future Warning: sklearn. externals. joblib is deprecated in 0. 21 and will be removed in 0. 23. Please import this fu nctionality directly from joblib, which can be installed with: pip install joblib. If this warning is raised whe n loading pickled models, you may need to re-serialize those models with scikit-learn 0. 21+. warnings. warn(msg, category=Future Warning) RDKit WARNING: [18:15:24] Enabling RDKit 2019. 09. 3 jupyter extensions /Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorflow/python/framework/dtypes. py:516 : Future Warning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'. _np_qint8 = np. dtype([("qint8", np. int8, 1)]) /Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorflow/python/framework/dtypes. py:517 : Future Warning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'. _np_quint8 = np. dtype([("quint8", np. uint8, 1)]) /Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorflow/python/framework/dtypes. py:518 : Future Warning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'. _np_qint16 = np. dtype([("qint16", np. int16, 1)]) /Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorflow/python/framework/dtypes. py:519 : Future Warning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'. _np_quint16 = np. dtype([("quint16", np. uint16, 1)]) /Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorflow/python/framework/dtypes. py:520 : Future Warning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'. _np_qint32 = np. dtype([("qint32", np. int32, 1)]) /Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorflow/python/framework/dtypes. py:525 : Future Warning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'. np_resource = np. dtype([("resource", np. ubyte, 1)]) /Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorboard/compat/tensorflow_stub/dtypes. py:541: Future Warning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'. _np_qint8 = np. dtype([("qint8", np. int8, 1)]) /Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorboard/compat/tensorflow_stub/dtypes. py:542: Future Warning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'. _np_quint8 = np. dtype([("quint8", np. uint8, 1)]) /Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorboard/compat/tensorflow_stub/dtypes. py:543: Future Warning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'. _np_qint16 = np. dtype([("qint16", np. 
int16, 1)]) /Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorboard/compat/tensorflow_stub/dtypes. py:544: Future Warning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'. _np_quint16 = np. dtype([("quint16", np. uint16, 1)]) /Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorboard/compat/tensorflow_stub/dtypes. py:545: Future Warning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'. _np_qint32 = np. dtype([("qint32", np. int32, 1)]) /Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorboard/compat/tensorflow_stub/dtypes. py:550: Future Warning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'. np_resource = np. dtype([("resource", np. ubyte, 1)]) from deepchem. models import Graph Conv Model from deepchem. data import Numpy Dataset from sklearn. metrics import average_precision_score import numpy as np tasks, all_datasets, transformers = hiv_datasets. load_hiv ( featurizer = "Graph Conv" ) train, valid, test = [ Numpy Dataset. from_Disk Dataset ( x ) for x in all_datasets ] model = Graph Conv Model ( 1, mode = "classification" ) model. fit ( train ) | deepchem.pdf |
Loading raw samples now. shard_size: 8192 About to start loading CSV from /var/folders/st/ds45jcqj2232lvhr0y9qt5sc0000gn/T/HIV. csv Loading shard 1 of size 8192. Featurizing sample 0 Featurizing sample 1000 Featurizing sample 2000 Featurizing sample 3000 Featurizing sample 4000 Featurizing sample 5000 Featurizing sample 6000 Featurizing sample 7000 Featurizing sample 8000 TIMING: featurizing shard 0 took 12. 479 s Loading shard 2 of size 8192. Featurizing sample 0 Featurizing sample 1000 Featurizing sample 2000 Featurizing sample 3000 Featurizing sample 4000 Featurizing sample 5000 Featurizing sample 6000 Featurizing sample 7000 Featurizing sample 8000 TIMING: featurizing shard 1 took 13. 668 s Loading shard 3 of size 8192. Featurizing sample 0 Featurizing sample 1000 Featurizing sample 2000 Featurizing sample 3000 Featurizing sample 4000 Featurizing sample 5000 Featurizing sample 6000 Featurizing sample 7000 Featurizing sample 8000 TIMING: featurizing shard 2 took 13. 550 s Loading shard 4 of size 8192. Featurizing sample 0 Featurizing sample 1000 Featurizing sample 2000 Featurizing sample 3000 Featurizing sample 4000 Featurizing sample 5000 Featurizing sample 6000 Featurizing sample 7000 Featurizing sample 8000 TIMING: featurizing shard 3 took 13. 173 s Loading shard 5 of size 8192. Featurizing sample 0 Featurizing sample 1000 Featurizing sample 2000 RDKit WARNING: [18:16:53] WARNING: not removing hydrogen atom without neighbors RDKit WARNING: [18:16:53] WARNING: not removing hydrogen atom without neighbors Featurizing sample 3000 Featurizing sample 4000 Featurizing sample 5000 Featurizing sample 6000 Featurizing sample 7000 Featurizing sample 8000 TIMING: featurizing shard 4 took 13. 362 s Loading shard 6 of size 8192. Featurizing sample 0 TIMING: featurizing shard 5 took 0. 355 s TIMING: dataset construction took 80. 394 s Loading dataset from disk. TIMING: dataset construction took 16. 676 s Loading dataset from disk. TIMING: dataset construction took 7. 529 s Loading dataset from disk. TIMING: dataset construction took 7. 796 s Loading dataset from disk. TIMING: dataset construction took 17. 521 s Loading dataset from disk. TIMING: dataset construction took 7. 770 s Loading dataset from disk. TIMING: dataset construction took 7. 873 s Loading dataset from disk. TIMING: dataset construction took 15. 495 s Loading dataset from disk. TIMING: dataset construction took 1. 959 s Loading dataset from disk. TIMING: dataset construction took 1. 949 s Loading dataset from disk. | deepchem.pdf |
WARNING:tensorflow:From /Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorflow/python /ops/init_ops. py:1251: calling Variance Scaling. __init__ (from tensorflow. python. ops. init_ops) with dtype is depr ecated and will be removed in a future version. Instructions for updating: Call initializer instance with the dtype argument instead of passing it to the constructor WARNING:tensorflow:Entity <bound method Graph Conv. call of <deepchem. models. layers. Graph Conv object at 0x1a3e35c0 48>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Graph Conv. call of <deepchem. models. layers. Graph Conv object at 0x1a3e35c048>>: Attribute Error: module 'gast' has no attribute 'Num' WARNING: Entity <bound method Graph Conv. call of <deepchem. models. layers. Graph Conv object at 0x1a3e35c048>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, s et the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Graph Conv. call of <deepchem. models. layers. Graph Conv object at 0x1a3e35c048>>: Attribute Error: modu le 'gast' has no attribute 'Num' WARNING:tensorflow:Entity <bound method Graph Pool. call of <deepchem. models. layers. Graph Pool object at 0x1a41856e 80>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Graph Pool. call of <deepchem. models. layers. Graph Pool object at 0x1a41856e80>>: Attribute Error: module 'gast' has no attribute 'Num' WARNING: Entity <bound method Graph Pool. call of <deepchem. models. layers. Graph Pool object at 0x1a41856e80>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, s et the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Graph Pool. call of <deepchem. models. layers. Graph Pool object at 0x1a41856e80>>: Attribute Error: modu le 'gast' has no attribute 'Num' WARNING:tensorflow:Entity <bound method Graph Conv. call of <deepchem. models. layers. Graph Conv object at 0x1a49f5aa 90>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Graph Conv. call of <deepchem. models. layers. Graph Conv object at 0x1a49f5aa90>>: Attribute Error: module 'gast' has no attribute 'Num' WARNING: Entity <bound method Graph Conv. call of <deepchem. models. layers. Graph Conv object at 0x1a49f5aa90>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, s et the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Graph Conv. call of <deepchem. models. layers. Graph Conv object at 0x1a49f5aa90>>: Attribute Error: modu le 'gast' has no attribute 'Num' WARNING:tensorflow:Entity <bound method Graph Pool. call of <deepchem. models. layers. 
Graph Pool object at 0x1a43f5d1 98>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Graph Pool. call of <deepchem. models. layers. Graph Pool object at 0x1a43f5d198>>: Attribute Error: module 'gast' has no attribute 'Num' WARNING: Entity <bound method Graph Pool. call of <deepchem. models. layers. Graph Pool object at 0x1a43f5d198>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, s et the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Graph Pool. call of <deepchem. models. layers. Graph Pool object at 0x1a43f5d198>>: Attribute Error: modu le 'gast' has no attribute 'Num' WARNING:tensorflow:Entity <bound method Graph Gather. call of <deepchem. models. layers. Graph Gather object at 0x1a43 f3a940>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When fi ling the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Ca use: converting <bound method Graph Gather. call of <deepchem. models. layers. Graph Gather object at 0x1a43f3a940>>: Attribute Error: module 'gast' has no attribute 'Num' WARNING: Entity <bound method Graph Gather. call of <deepchem. models. layers. Graph Gather object at 0x1a43f3a940>> c ould not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the b ug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: conve rting <bound method Graph Gather. call of <deepchem. models. layers. Graph Gather object at 0x1a43f3a940>>: Attribute E rror: module 'gast' has no attribute 'Num' WARNING:tensorflow:From /Users/bharath/Code/deepchem/deepchem/models/layers. py:222: The name tf. unsorted_segment _sum is deprecated. Please use tf. math. unsorted_segment_sum instead. WARNING:tensorflow:From /Users/bharath/Code/deepchem/deepchem/models/layers. py:224: The name tf. unsorted_segment _max is deprecated. Please use tf. math. unsorted_segment_max instead. WARNING:tensorflow:Entity <bound method Trim Graph Output. call of <deepchem. models. graph_models. Trim Graph Output ob ject at 0x1a41a9ecf8>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the fu ll output. Cause: converting <bound method Trim Graph Output. call of <deepchem. models. graph_models. Trim Graph Output object at 0x1a41a9ecf8>>: Attribute Error: module 'gast' has no attribute 'Num' WARNING: Entity <bound method Trim Graph Output. call of <deepchem. models. graph_models. Trim Graph Output object at 0x 1a41a9ecf8>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. Whe n filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Trim Graph Output. call of <deepchem. models. graph_models. Trim Graph Output object a t 0x1a41a9ecf8>>: Attribute Error: module 'gast' has no attribute 'Num' WARNING:tensorflow:From /Users/bharath/Code/deepchem/deepchem/models/keras_model. py:169: The name tf. Session is deprecated. Please use tf. 
compat. v1. Session instead. WARNING:tensorflow:From /Users/bharath/Code/deepchem/deepchem/models/optimizers. py:76: The name tf. train. Adam Opt imizer is deprecated. Please use tf. compat. v1. train. Adam Optimizer instead. WARNING:tensorflow:From /Users/bharath/Code/deepchem/deepchem/models/keras_model. py:258: The name tf. global_vari ables is deprecated. Please use tf. compat. v1. global_variables instead. | deepchem.pdf |
WARNING:tensorflow:From /Users/bharath/Code/deepchem/deepchem/models/keras_model. py:260: The name tf. variables_i nitializer is deprecated. Please use tf. compat. v1. variables_initializer instead. WARNING:tensorflow:Entity <bound method Graph Conv. call of <deepchem. models. layers. Graph Conv object at 0x1a3e35c0 48>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Graph Conv. call of <deepchem. models. layers. Graph Conv object at 0x1a3e35c048>>: Attribute Error: module 'gast' has no attribute 'Num' WARNING: Entity <bound method Graph Conv. call of <deepchem. models. layers. Graph Conv object at 0x1a3e35c048>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, s et the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Graph Conv. call of <deepchem. models. layers. Graph Conv object at 0x1a3e35c048>>: Attribute Error: modu le 'gast' has no attribute 'Num' WARNING:tensorflow:Entity <bound method Graph Pool. call of <deepchem. models. layers. Graph Pool object at 0x1a41856e 80>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Graph Pool. call of <deepchem. models. layers. Graph Pool object at 0x1a41856e80>>: Attribute Error: module 'gast' has no attribute 'Num' WARNING: Entity <bound method Graph Pool. call of <deepchem. models. layers. Graph Pool object at 0x1a41856e80>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, s et the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Graph Pool. call of <deepchem. models. layers. Graph Pool object at 0x1a41856e80>>: Attribute Error: modu le 'gast' has no attribute 'Num' WARNING:tensorflow:Entity <bound method Graph Conv. call of <deepchem. models. layers. Graph Conv object at 0x1a49f5aa 90>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Graph Conv. call of <deepchem. models. layers. Graph Conv object at 0x1a49f5aa90>>: Attribute Error: module 'gast' has no attribute 'Num' WARNING: Entity <bound method Graph Conv. call of <deepchem. models. layers. Graph Conv object at 0x1a49f5aa90>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, s et the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Graph Conv. call of <deepchem. models. layers. Graph Conv object at 0x1a49f5aa90>>: Attribute Error: modu le 'gast' has no attribute 'Num' WARNING:tensorflow:Entity <bound method Graph Pool. call of <deepchem. models. layers. Graph Pool object at 0x1a43f5d1 98>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. 
When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Graph Pool. call of <deepchem. models. layers. Graph Pool object at 0x1a43f5d198>>: Attribute Error: module 'gast' has no attribute 'Num' WARNING: Entity <bound method Graph Pool. call of <deepchem. models. layers. Graph Pool object at 0x1a43f5d198>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, s et the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Graph Pool. call of <deepchem. models. layers. Graph Pool object at 0x1a43f5d198>>: Attribute Error: modu le 'gast' has no attribute 'Num' WARNING:tensorflow:Entity <bound method Graph Gather. call of <deepchem. models. layers. Graph Gather object at 0x1a43 f3a940>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When fi ling the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Ca use: converting <bound method Graph Gather. call of <deepchem. models. layers. Graph Gather object at 0x1a43f3a940>>: Attribute Error: module 'gast' has no attribute 'Num' WARNING: Entity <bound method Graph Gather. call of <deepchem. models. layers. Graph Gather object at 0x1a43f3a940>> c ould not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the b ug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: conve rting <bound method Graph Gather. call of <deepchem. models. layers. Graph Gather object at 0x1a43f3a940>>: Attribute E rror: module 'gast' has no attribute 'Num' WARNING:tensorflow:Entity <bound method Trim Graph Output. call of <deepchem. models. graph_models. Trim Graph Output ob ject at 0x1a41a9ecf8>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the fu ll output. Cause: converting <bound method Trim Graph Output. call of <deepchem. models. graph_models. Trim Graph Output object at 0x1a41a9ecf8>>: Attribute Error: module 'gast' has no attribute 'Num' WARNING: Entity <bound method Trim Graph Output. call of <deepchem. models. graph_models. Trim Graph Output object at 0x 1a41a9ecf8>> could not be transformed and will be executed as-is. Please report this to the Autgo Graph team. Whe n filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: converting <bound method Trim Graph Output. call of <deepchem. models. graph_models. Trim Graph Output object a t 0x1a41a9ecf8>>: Attribute Error: module 'gast' has no attribute 'Num' WARNING:tensorflow:From /Users/bharath/Code/deepchem/deepchem/models/losses. py:108: The name tf. losses. softmax_c ross_entropy is deprecated. Please use tf. compat. v1. losses. softmax_cross_entropy instead. WARNING:tensorflow:From /Users/bharath/Code/deepchem/deepchem/models/losses. py:109: The name tf. losses. Reduction is deprecated. Please use tf. compat. v1. losses. Reduction instead. WARNING:tensorflow:From /Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorflow/python /ops/math_grad. py:318: add_dispatch_support. <locals>. wrapper (from tensorflow. python. ops. 
array_ops) is deprecate d and will be removed in a future version. Instructions for updating: Use tf. where in 2. 0, which has the same broadcast rule as np. where | deepchem.pdf |
/Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorflow/python/ops/gradients_util. py:9 3: User Warning: Converting sparse Indexed Slices to a dense Tensor of unknown shape. This may consume a large amo unt of memory. "Converting sparse Indexed Slices to a dense Tensor of unknown shape. " /Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorflow/python/ops/gradients_util. py:9 3: User Warning: Converting sparse Indexed Slices to a dense Tensor of unknown shape. This may consume a large amo unt of memory. "Converting sparse Indexed Slices to a dense Tensor of unknown shape. " /Users/bharath/opt/anaconda3/envs/deepchem/lib/python3. 6/site-packages/tensorflow/python/ops/gradients_util. py:9 3: User Warning: Converting sparse Indexed Slices to a dense Tensor of unknown shape. This may consume a large amo unt of memory. "Converting sparse Indexed Slices to a dense Tensor of unknown shape. " 0. 0 y_true = np. squeeze ( valid. y ) y_pred = model. predict ( valid )[:, 0, 1 ] print ( "Average Precision Score: %s " % average_precision_score ( y_true, y_pred )) sorted_results = sorted ( zip ( y_pred, y_true ), reverse = True ) hit_rate_100 = sum ( x [ 1 ] for x in sorted_results [: 100 ]) / 100 print ( "Hit Rate Top 100: %s " % hit_rate_100 ) Average Precision Score:0. 19783388433313015 Hit Rate Top 100: 0. 37 Retrain Model Over Full Dataset For The Screen tasks, all_datasets, transformers = hiv_datasets. load_hiv ( featurizer = "Graph Conv", split = None ) model = Graph Conv Model ( 1, mode = "classification", model_dir = "/tmp/zinc/screen_model" ) model. fit ( all_datasets [ 0 ]) | deepchem.pdf |
Loading raw samples now.
shard_size: 8192
About to start loading CSV from /tmp/HIV. csv
Loading shard 1 of size 8192.
Featurizing sample 0
Featurizing sample 1000
Featurizing sample 2000
Featurizing sample 3000
Featurizing sample 4000
Featurizing sample 5000
Featurizing sample 6000
Featurizing sample 7000
Featurizing sample 8000
TIMING: featurizing shard 0 took 15. 701 s
Loading shard 2 of size 8192.
Featurizing sample 0
Featurizing sample 1000
Featurizing sample 2000
Featurizing sample 3000
Featurizing sample 4000
Featurizing sample 5000
Featurizing sample 6000
Featurizing sample 7000
Featurizing sample 8000
TIMING: featurizing shard 1 took 15. 869 s
Loading shard 3 of size 8192.
Featurizing sample 0
Featurizing sample 1000
Featurizing sample 2000
Featurizing sample 3000
Featurizing sample 4000
Featurizing sample 5000
Featurizing sample 6000
Featurizing sample 7000
Featurizing sample 8000
TIMING: featurizing shard 2 took 19. 106 s
Loading shard 4 of size 8192.
Featurizing sample 0
Featurizing sample 1000
Featurizing sample 2000
Featurizing sample 3000
Featurizing sample 4000
Featurizing sample 5000
Featurizing sample 6000
Featurizing sample 7000
Featurizing sample 8000
TIMING: featurizing shard 3 took 16. 267 s
Loading shard 5 of size 8192.
Featurizing sample 0
Featurizing sample 1000
Featurizing sample 2000
Featurizing sample 3000
Featurizing sample 4000
Featurizing sample 5000
Featurizing sample 6000
Featurizing sample 7000
Featurizing sample 8000
TIMING: featurizing shard 4 took 16. 754 s
Loading shard 6 of size 8192.
Featurizing sample 0
TIMING: featurizing shard 5 took 0. 446 s
TIMING: dataset construction took 98. 214 s
Loading dataset from disk.
TIMING: dataset construction took 21. 127 s
Loading dataset from disk.
/home/leswing/miniconda3/envs/deepchem/lib/python3. 6/site-packages/tensorflow/python/ops/gradients_impl. py:100: User Warning: Converting sparse Indexed Slices to a dense Tensor of unknown shape. This may consume a large amount of memory.
  "Converting sparse Indexed Slices to a dense Tensor of unknown shape. "

2. Create Work-Units

1. Download All of ZINC15. Go to http://zinc15.docking.org/tranches/home and download all non-empty tranches in .smi format. I found it easiest to download the wget script and then run it. For the rest of this tutorial I will assume ZINC was downloaded to /tmp/zinc.

The way ZINC downloads the data isn't great for inference. We want "Work-Units" that a single CPU can execute in a reasonable amount of time (10 minutes to an hour). To accomplish this we are going to split the ZINC data into
files each with 500 thousand lines.

mkdir /tmp/zinc/screen
find /tmp/zinc -name '*.smi' -exec cat {} \; | grep -iv "smiles" \
  | split -l 500000 /tmp/zinc/screen/segment

This bash command
1. finds all .smi files
2. prints the contents of each file to stdout
3. removes header lines
4. splits the stream into multiple files in /tmp/zinc/screen, each 500k molecules long

3. Create Inference Script

Now that we have work units, we need to construct a program which ingests a work unit and logs the result. It is important that the logging mechanism is thread safe! For this example we will get the work unit via a file path, and log the result to a file. An easy extension to distribute over multiple computers would be to get the work unit via a URL, and log the results to a distributed queue. Here is what mine looks like

inference.py

import sys
import deepchem as dc
import numpy as np
from rdkit import Chem
import pickle
import os

def create_dataset(fname, batch_size=50000):
    featurizer = dc.feat.ConvMolFeaturizer()
    fin = open(fname)
    mols, orig_lines = [], []
    for line in fin:
        line = line.strip().split()
        try:
            mol = Chem.MolFromSmiles(line[0])
            if mol is None:
                continue
            mols.append(mol)
            orig_lines.append(line)
        except:
            pass
        # Yield a featurized batch every batch_size molecules.
        if len(mols) > 0 and len(mols) % batch_size == 0:
            features = featurizer.featurize(mols)
            y = np.ones(shape=(len(mols), 1))
            ds = dc.data.NumpyDataset(features, y)
            yield ds, orig_lines
            mols, orig_lines = [], []
    # Flush whatever is left at the end of the file.
    if len(mols) > 0:
        features = featurizer.featurize(mols)
        y = np.ones(shape=(len(mols), 1))
        ds = dc.data.NumpyDataset(features, y)
        yield ds, orig_lines

def evaluate(fname):
    fout_name = "%s_out.smi" % fname
    model = dc.models.TensorGraph.load_from_dir('/tmp/zinc/screen_model')
    for ds, lines in create_dataset(fname):
        y_pred = np.squeeze(model.predict(ds), axis=1)
        with open(fout_name, 'a') as fout:
            for index, line in enumerate(lines):
                # Append the positive-class probability to the original line.
                line.append(y_pred[index][1])
                line = [str(x) for x in line]
                line = "\t".join(line)
                fout.write("%s\n" % line)

if __name__ == "__main__":
    evaluate(sys.argv[1])
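Before queuing thousands of work units, it's worth smoke-testing the script on a single segment. A small sketch (the segment name segmentaa is just the first suffix that split produces; adjust to a file that actually exists):

import subprocess

# Run inference on one work unit; results land in <segment>_out.smi next to it.
subprocess.run(["python", "inference.py", "/tmp/zinc/screen/segmentaa"], check=True)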
4. Load "Work-Units" into a "Work Queue"

We are going to use a flat file as our distribution mechanism. It will be a bash script calling our inference script for every work unit. If you are at an academic institution this would be queuing your jobs in pbs/qsub/slurm. An option for cloud computing would be rabbitmq or kafka.

import os

work_units = os.listdir('/tmp/zinc/screen')
with open('/tmp/zinc/work_queue.sh', 'w') as fout:
    fout.write("#!/bin/bash\n")
    for work_unit in work_units:
        full_path = os.path.join('/tmp/zinc/screen', work_unit)
        fout.write("python inference.py %s\n" % full_path)  # one command per line

5. Consume work units from the "Work Queue"

We will consume work units from our work queue using a very simple process pool. It takes lines from our "Work Queue" and runs them, running as many processes in parallel as we have CPUs. If you are using a supercomputing cluster system like pbs/qsub/slurm it will take care of this for you. The key is to use one CPU per work unit to get the highest throughput. We accomplish that here using the Linux utility "taskset". Using a c5.18xlarge on AWS this will finish overnight.

process_pool.py

import multiprocessing
import sys
from multiprocessing.pool import Pool

import delegator

def run_command(args):
    q, command = args
    cpu_id = q.get()
    try:
        # Pin the command to a single core for maximum throughput.
        command = "taskset -c %s %s" % (cpu_id, command)
        print("running %s" % command)
        c = delegator.run(command)
        print(c.err)
        print(c.out)
    except Exception as e:
        print(e)
    # Return the core to the queue so the next work unit can use it.
    q.put(cpu_id)

def main(n_processors, command_file):
    commands = [x.strip() for x in open(command_file).readlines()]
    commands = list(filter(lambda x: not x.startswith("#"), commands))
    q = multiprocessing.Manager().Queue()
    for i in range(n_processors):
        q.put(i)
    argslist = [(q, x) for x in commands]
    pool = Pool(processes=n_processors)
    pool.map(run_command, argslist)

if __name__ == "__main__":
    processors = multiprocessing.cpu_count()
    main(processors, sys.argv[1])

>> python process_pool.py /tmp/zinc/work_queue.sh

6. Gather Results

Since we logged our results to *_out.smi we now need to gather all of them up and sort them by our predictions. The resulting file will be > 40GB. To analyze the data further you can use dask, or put the data in an RDKit postgres cartridge. Here I show how to join and sort the data to get the "best" results.

find /tmp/zinc -name '*_out.smi' -exec cat {} \; > /tmp/zinc/screen/results.smi
sort -rg -k 3,3 /tmp/zinc/screen/results.smi > /tmp/zinc/screen/sorted_results.smi

# Put the top 100k scoring molecules in their own file
head -n 100000 /tmp/zinc/screen/sorted_results.smi > /tmp/zinc/screen/top_100k.smi
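As noted above, the raw results file is too big for pandas; here is a hedged dask sketch (assuming the three tab-separated columns written by inference.py: SMILES, ZINC ID, score) that does the same join-and-sort without the shell:

import dask.dataframe as dd

# Lazily read every per-segment output file, then take the top 100k by score.
results = dd.read_csv('/tmp/zinc/screen/*_out.smi', sep='\t',
                      names=['smiles', 'zinc_id', 'score'])
top = results.nlargest(100000, 'score').compute()  # materializes a pandas DataFrame
top.to_csv('/tmp/zinc/screen/top_100k.csv', index=False)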
/tmp/zinc/screen/top_100k.smi is now a small enough file to investigate using standard tools like pandas.

from rdkit import Chem
from rdkit.Chem.Draw import IPythonConsole
from IPython.display import SVG
from rdkit.Chem.Draw import rdMolDraw2D

best_mols = [Chem.MolFromSmiles(x.strip().split()[0]) for x in open('/tmp/zinc/screen/top_100k.smi').readlines()[:100]]
best_scores = [x.strip().split()[2] for x in open('/tmp/zinc/screen/top_100k.smi').readlines()[:100]]

print(best_scores[0])
best_mols[0]

0. 98874843

print(best_scores[0])
best_mols[1]

0. 98874843

print(best_scores[0])
best_mols[2]

0. 98874843

print(best_scores[0])
best_mols[3]

0. 98874843

The screen seems to favor molecules with one or multiple sulfur trioxides. The top scoring molecules also have low diversity. When creating a "buy list" we want to optimize for more things than just activity, for instance diversity and drug-likeness (MPO).

# We use the code from https://github.com/PatWalters/rd_filters; a detailed explanation is here:
# http://practicalcheminformatics.blogspot.com/2018/08/filtering-chemical-libraries.html
# We will run the PAINS filter on best_mols as suggested by Issue 1355 (https://github.com/deepchem/deepchem/issues/1355)
import os
import pandas as pd
from rdkit import Chem
from rdkit.Chem.Descriptors import MolWt, MolLogP, NumHDonors, NumHAcceptors, TPSA
from rdkit.Chem.rdMolDescriptors import CalcNumRotatableBonds

# First we get the rules from alert_collection.csv and then filter to get the PAINS rules
rule_df = pd.read_csv(os.path.join(os.path.abspath(''), 'assets', 'alert_collection.csv'))
rule_df = rule_df[rule_df['rule_set_name'] == 'PAINS']
rule_list = []
for rule_id, smarts, max_val, desc in rule_df[["rule_id", "smarts", "max", "description"]].values.tolist():
    smarts_mol = Chem.MolFromSmarts(smarts)
    if smarts_mol:
        rule_list.append((smarts_mol, max_val, desc))

def evaluate(smile):
    mol = Chem.MolFromSmiles(smile)
    if mol is None:
        return [smile, "INVALID", -999, -999, -999, -999, -999, -999]
    desc_list = [MolWt(mol), MolLogP(mol), NumHDonors(mol), NumHAcceptors(mol), TPSA(mol), CalcNumRotatableBonds(mol)]
    for patt, max_val, desc in rule_list:
        if len(mol.GetSubstructMatches(patt)) > max_val:
            return [smile, desc + " > %d" % (max_val)] + desc_list
    return [smile, "OK"] + desc_list

smiles = [x.strip().split()[0] for x in open('/tmp/zinc/screen/top_100k.smi').readlines()[:100]]  # obtain the smiles
res = list(map(evaluate, smiles))  # here we apply the PAINS filter
df = pd.DataFrame(res, columns=["SMILES", "FILTER", "MW", "LogP", "HBD", "HBA", "TPSA", "Rot"])
df_ok = df[
    (df.FILTER == "OK") &
    df.MW.between(*[0, 500]) &     # MW
    df.LogP.between(*[-5, 5]) &    # LogP
    df.HBD.between(*[0, 5]) &      # HBD
    df.HBA.between(*[0, 10]) &     # HBA
    df.TPSA.between(*[0, 200]) &   # TPSA
    df.Rot.between(*[0, 10])       # Rot
]

Congratulations! Time to join the Community!

Congratulations on completing this tutorial notebook! If you enjoyed working through the tutorial, and want to continue working with Deep Chem, we encourage you to finish the rest of the tutorials in this series. You can also help the Deep Chem community in the following ways:

Star Deep Chem on Git Hub

This helps build awareness of the Deep Chem project and the tools for open source drug discovery that we're trying to build.

Join the Deep Chem Gitter

The Deep Chem Gitter hosts a number of scientists, developers, and enthusiasts interested in deep learning for the life sciences. Join the conversation!
Introduction to the Molecular Attention Transformer

In this tutorial we will learn more about the Molecular Attention Transformer, or MAT. MAT is a transformer-based model aimed at molecular prediction tasks. It is easy to tune and performs quite well relative to other models on such tasks. The weights learned by MAT are chemically interpretable, which makes the model quite useful.

Reference Paper: Molecular Attention Transformer, Maziarka et al.

In this tutorial, we will explore how to train MAT, and predict hydration free energy values for molecules from the Freesolv dataset with MAT.

Colab

This tutorial and the rest in this sequence are designed to be done in Google Colab. If you'd like to open this notebook in Colab, you can use the following link.

Open in Colab

! pip install --pre deepchem

Import required modules

import deepchem as dc
from deepchem.models.torch_models import MATModel
from deepchem.feat import MATFeaturizer
import matplotlib.pyplot as plt

wandb : WARNING W&B installed but not logged in. Run `wandb login` or set the WANDB_API_KEY env variable.
wandb : WARNING W&B installed but not logged in. Run `wandb login` or set the WANDB_API_KEY env variable.

Molecule Featurization using MATFeaturizer

MATFeaturizer is the featurizer intended to be used with the Molecular Attention Transformer, or MAT for short. MATFeaturizer takes a SMILES string or molecule as input and returns a MATEncoding dataclass object, which contains 3 numpy arrays: a node features matrix, an adjacency matrix, and a distance matrix.

featurizer = dc.feat.MATFeaturizer()
# Let us now take an example list of SMILES strings and featurize it.
smile_string = ["CCC"]
output = featurizer.featurize(smile_string)
print(type(output[0]))
print(output[0].node_features)     # one row per atom, plus a leading dummy node
print(output[0].adjacency_matrix)  # bond connectivity
print(output[0].distance_matrix)   # inter-atomic distances; 1e6 marks the dummy node

<class 'deepchem. feat. molecule_featurizers. mat_featurizer. MATEncoding'>
[[1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. ]
 [0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. ]
 [0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. ]
 [0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. ]]
[[0. 0. 0. 0. ]
 [0. 0. 0. 1. ]
 [0. 0. 0. 1. ]
 [0. 1. 1. 0. ]]
[[1. e+06 1. e+06 1. e+06 1. e+06]
 [1. e+06 0. e+00 2. e+00 1. e+00]
 [1. e+06 2. e+00 0. e+00 1. e+00]
 [1. e+06 1. e+00 1. e+00 0. e+00]]

Getting the Freesolv hydration free energy dataset

We will now acquire the Freesolv hydration free energy dataset from Molecule Net. If it already exists in the directory, the file will be used; otherwise, deepchem will automatically download the dataset from its AWS bucket.

tasks, dataset, transformers = dc.molnet.load_freesolv()
train_dataset, val_dataset, test_dataset = dataset
train_smiles = train_dataset.ids
val_smiles = val_dataset.ids
train_dataset

<Disk Dataset X. shape: (513,), y. shape: (513, 1), w. shape: (513, 1), ids: ['CCCCNCCCC' 'CCOC=O' 'CCCCCCCCC' ... 'COC' 'CCCCCCCCBr' 'CCCc1ccc(c(c1)OC)O'], task_names: ['y']>

Training the model

Now that we have acquired the dataset and made the necessary imports, we will instantiate the Molecular Attention Transformer in deepchem, called MATModel, and train it. We will be using the default parameters for the purposes of this tutorial; however, they can be changed at any time according to the user's preferences.

device = 'cpu'
model = MATModel(device=device)
losses, val_losses = [], []

%%time
max_epochs = 10
for epoch in range(max_epochs):
    loss = model.fit(train_dataset, nb_epoch=1, max_checkpoints_to_keep=1, all_losses=losses)
    metric = dc.metrics.Metric(dc.metrics.score_function.rms_score)
    val_losses.append(model.evaluate(val_dataset, metrics=[metric])['rms_score'] ** 2)
# The warnings below are not relevant to this tutorial, so we can safely skip them.

/home/atreyamaj/Desktop/deepchem/deepchem/models/torch_models/mat. py:165: User Warning: To copy construct from a tensor, it is recommended to use source Tensor. clone(). detach() or source Tensor. clone(). detach(). requires_grad_(True), rather than torch. tensor(source Tensor).
  node_features = torch. tensor(data[0]). float()
/home/atreyamaj/Desktop/deepchem/deepchem/models/torch_models/mat. py:166: User Warning: To copy construct from a tensor, it is recommended to use source Tensor. clone(). detach() or source Tensor. clone(). detach(). requires_grad_(True), rather than torch. tensor(source Tensor).
  adjacency_matrix = torch. tensor(data[1]). float()
/home/atreyamaj/Desktop/deepchem/deepchem/models/torch_models/mat. py:167: User Warning: To copy construct from a tensor, it is recommended to use source Tensor. clone(). detach() or source Tensor. clone(). detach(). requires_grad_(True), rather than torch. tensor(source Tensor).
  distance_matrix = torch. tensor(data[2]). float()
/home/atreyamaj/Desktop/deepchem/deepchem/models/torch_models/layers. py:171: User Warning: To copy construct from a tensor, it is recommended to use source Tensor. clone(). detach() or source Tensor. clone(). detach(). requires_grad_(True), rather than torch. tensor(source Tensor).
  torch. sum(torch. tensor(adj_matrix), dim=-1). unsqueeze(2) + eps)
/home/atreyamaj/Desktop/deepchem/deepchem/models/torch_models/layers. py:178: User Warning: To copy construct from a tensor, it is recommended to use source Tensor. clone(). detach() or source Tensor. clone(). detach(). requires_grad_(True), rather than torch. tensor(source Tensor).
  distance_matrix = torch. tensor(distance_matrix). squeeze(). masked_fill(
CPU times: user 20min 48s, sys: 10. 7 s, total: 20min 58s
Wall time: 2min 41s

f, ax = plt.subplots()
ax.scatter(range(len(losses)), losses, label='train loss')
ax.scatter(range(len(val_losses)), val_losses, label='val loss')
plt.legend(loc='upper right');
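A small follow-up sketch (using the losses and val_losses lists collected above) reports the epoch with the lowest validation loss, which is where you would normally stop training or keep a checkpoint:

import numpy as np

# Find the epoch with the lowest validation loss seen during training.
best_epoch = int(np.argmin(val_losses))
print("Lowest validation loss %.4f at epoch %d" % (val_losses[best_epoch], best_epoch))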
Testing the model

Optimally, MAT should be trained for many more epochs on a GPU. Due to computational constraints, we train the model for very few epochs in this tutorial. Let us now see how to predict hydration free energy values for molecules with MAT.

# We will be predicting the value for the SMILES string we featurized earlier in the MATFeaturizer section.
model.predict_on_batch(output)

/home/atreyamaj/Desktop/deepchem/deepchem/models/torch_models/mat. py:165: User Warning: To copy construct from a tensor, it is recommended to use source Tensor. clone(). detach() or source Tensor. clone(). detach(). requires_grad_(True), rather than torch. tensor(source Tensor).
  node_features = torch. tensor(data[0]). float()
/home/atreyamaj/Desktop/deepchem/deepchem/models/torch_models/mat. py:166: User Warning: To copy construct from a tensor, it is recommended to use source Tensor. clone(). detach() or source Tensor. clone(). detach(). requires_grad_(True), rather than torch. tensor(source Tensor).
  adjacency_matrix = torch. tensor(data[1]). float()
/home/atreyamaj/Desktop/deepchem/deepchem/models/torch_models/mat. py:167: User Warning: To copy construct from a tensor, it is recommended to use source Tensor. clone(). detach() or source Tensor. clone(). detach(). requires_grad_(True), rather than torch. tensor(source Tensor).
  distance_matrix = torch. tensor(data[2]). float()
/home/atreyamaj/Desktop/deepchem/deepchem/models/torch_models/layers. py:171: User Warning: To copy construct from a tensor, it is recommended to use source Tensor. clone(). detach() or source Tensor. clone(). detach(). requires_grad_(True), rather than torch. tensor(source Tensor).
  torch. sum(torch. tensor(adj_matrix), dim=-1). unsqueeze(2) + eps)
/home/atreyamaj/Desktop/deepchem/deepchem/models/torch_models/layers. py:178: User Warning: To copy construct from a tensor, it is recommended to use source Tensor. clone(). detach() or source Tensor. clone(). detach(). requires_grad_(True), rather than torch. tensor(source Tensor).
  distance_matrix = torch. tensor(distance_matrix). squeeze(). masked_fill(

array([[-0. 32668447]], dtype=float32)
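One caveat worth flagging: the Molecule Net loader applies a normalization transformer to y by default, so raw predictions are in transformed units. A sketch (using the transformers returned by load_freesolv above) maps them back to the original scale:

# Map normalized predictions back to the dataset's original units.
# `output` is the featurized batch and `transformers` comes from load_freesolv.
pred = model.predict_on_batch(output)
pred_orig = dc.trans.undo_transforms(pred, transformers)
print(pred_orig)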
Generating Molecules with MolGAN
In this tutorial, we will train a MolGAN network on the Tox21 dataset, which contains 12,060 training samples and 647 test samples of chemical compounds. The MolGAN network was first introduced in "MolGAN: An implicit generative model for small molecular graphs" by De Cao and Kipf. It applies a GAN directly to graph data and uses a reinforcement learning objective to induce the network to generate molecules with certain chemical properties.
The architecture consists of three main components: a generator, a discriminator, and a reward network. The generator takes a sample (z) from a standard normal distribution and uses an MLP to generate the entire graph at once (this limits the network to a fixed maximum size). Specifically, it produces a dense adjacency tensor A (bond types) and an annotation matrix X (atom types). Since these are probabilities, a discrete, sparse x and a are generated through categorical sampling (see the short sketch after the data-loading steps below). The discriminator and reward network have the same architecture and receive graphs as inputs; a relational GCN and MLPs are used to produce the single scalar output.
Colab
This tutorial and the rest in this sequence are designed to be done in Google colab. If you'd like to open this notebook in colab, you can use the following link.
Setup
To run DeepChem within Colab, you'll need to run the following cell of installation commands.
! pip install --pre deepchem
import deepchem
deepchem.__version__
Requirement already satisfied: deepchem in /usr/local/lib/python3.10/dist-packages (2.8.1.dev20240603202041)
Requirement already satisfied: joblib in /usr/local/lib/python3.10/dist-packages (from deepchem) (1.4.2)
Requirement already satisfied: numpy>=1.21 in /usr/local/lib/python3.10/dist-packages (from deepchem) (1.25.2)
Requirement already satisfied: pandas in /usr/local/lib/python3.10/dist-packages (from deepchem) (2.0.3)
Requirement already satisfied: scikit-learn in /usr/local/lib/python3.10/dist-packages (from deepchem) (1.2.2)
Requirement already satisfied: sympy in /usr/local/lib/python3.10/dist-packages (from deepchem) (1.12.1)
Requirement already satisfied: scipy>=1.10.1 in /usr/local/lib/python3.10/dist-packages (from deepchem) (1.11.4)
Requirement already satisfied: rdkit in /usr/local/lib/python3.10/dist-packages (from deepchem) (2023.9.6)
Requirement already satisfied: python-dateutil>=2.8.2 in /usr/local/lib/python3.10/dist-packages (from pandas->deepchem) (2.8.2)
Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.10/dist-packages (from pandas->deepchem) (2023.4)
Requirement already satisfied: tzdata>=2022.1 in /usr/local/lib/python3.10/dist-packages (from pandas->deepchem) (2024.1)
Requirement already satisfied: Pillow in /usr/local/lib/python3.10/dist-packages (from rdkit->deepchem) (9.4.0)
Requirement already satisfied: threadpoolctl>=2.0.0 in /usr/local/lib/python3.10/dist-packages (from scikit-learn->deepchem) (3.5.0)
Requirement already satisfied: mpmath<1.4.0,>=1.1.0 in /usr/local/lib/python3.10/dist-packages (from sympy->deepchem) (1.3.0)
Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.10/dist-packages (from python-dateutil>=2.8.2->pandas->deepchem) (1.16.0)
WARNING:deepchem.feat.molecule_featurizers.rdkit_descriptors:No normalization for SPS. Feature removed!
WARNING:deepchem.feat.molecule_featurizers.rdkit_descriptors:No normalization for AvgIpc. Feature removed!
WARNING:tensorflow:From /usr/local/lib/python3.10/dist-packages/tensorflow/python/util/deprecation.py:588: calling function (from tensorflow.python.eager.polymorphic_function.polymorphic_function) with experimental_relax_shapes is deprecated and will be removed in a future version.
Instructions for updating:
experimental_relax_shapes is deprecated, use reduce_retracing instead
WARNING:deepchem.models.torch_models:Skipped loading modules with pytorch-geometric dependency, missing a dependency. No module named 'torch_geometric'
WARNING:deepchem.models:Skipped loading modules with pytorch-geometric dependency, missing a dependency. cannot import name 'DMPNN' from 'deepchem.models.torch_models' (/usr/local/lib/python3.10/dist-packages/deepchem/models/torch_models/__init__.py)
WARNING:deepchem.models:Skipped loading modules with pytorch-lightning dependency, missing a dependency. No module named 'lightning'
WARNING:deepchem.models:Skipped loading some Jax models, missing a dependency. No module named 'haiku'
'2.8.1.dev'
Import the packages you'll need.
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
import os
from collections import OrderedDict
import deepchem as dc
import deepchem.models
import torch
from deepchem.models.torch_models import BasicMolGANModel as MolGAN
from deepchem.models.optimizers import ExponentialDecay
from torch.nn.functional import one_hot
from rdkit import Chem
from rdkit.Chem.Draw import IPythonConsole
from rdkit.Chem import Draw
from deepchem.feat.molecule_featurizers.molgan_featurizer import GraphMatrix
Download, load, and extract the SMILES strings from the Tox21 dataset. The original paper used the QM9 dataset; we use the Tox21 dataset here to save time.
# Download from MolNet
# Try tox21 or LIPO dataset
tasks, datasets, transformers = dc.molnet.load_tox21()
df = pd.DataFrame(data={'smiles': datasets[0].ids})
Specify the maximum number of atoms to encode for the featurizer and the MolGAN network. The higher the number of atoms, the more data you'll have in the dataset. However, this also increases model complexity, since the input dimensions become higher.
num_atoms = 12
df
smiles
0 CC(O)(P(=O)(O)O)P(=O)(O)O
1 CC(C)(C)OOC(C)(C)CCC(C)(C)OOC(C)(C)C
2 OC[C@H](O)[C@@H](O)[C@H](O)CO
3 CCCCCCCC(=O)[O-].CCCCCCCC(=O)[O-].[Zn+2]
4 CC(C)COC(=O)C(C)C
... ...
6259 CC1CCCCN1CCCOC(=O)c1ccc(OC2CCCCC2)cc1
6260 Cc1cc(CCCOc2c(C)cc(-c3noc(C(F)(F)F)n3)cc2C)on1
6261 O=C1OC(OC(=O)c2cccnc2Nc2cccc(C(F)(F)F)c2)c2ccc...
6262 CC(=O)C1(C)CC2=C(CCCC2(C)C)CC1C
6263 CC(C)CCC[C@@H](C)[C@H]1CC(=O)C2=C3CC[C@H]4C[C@...
6264 rows × 1 columns
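Before featurizing the data, here is the categorical-sampling step promised in the introduction. This is a minimal illustrative sketch, not DeepChem's actual implementation: the tensor shapes, the random stand-ins for generator outputs, and the variable names are all assumptions.
import torch
from torch.nn.functional import one_hot

n_atoms, n_atom_types, n_bond_types = 12, 5, 4

# Stand-ins for the generator's probabilistic outputs: X holds per-atom
# distributions over atom types, A per-pair distributions over bond types.
X = torch.softmax(torch.randn(n_atoms, n_atom_types), dim=-1)
A = torch.softmax(torch.randn(n_atoms, n_atoms, n_bond_types), dim=-1)

# Categorical sampling turns the probabilities into discrete atom and bond types.
x = torch.distributions.Categorical(probs=X).sample()   # shape: (n_atoms,)
a = torch.distributions.Categorical(probs=A).sample()   # shape: (n_atoms, n_atoms)

# One-hot encode back to sparse 0/1 tensors when a dense encoding is needed.
x_sparse = one_hot(x, n_atom_types)
a_sparse = one_hot(a, n_bond_types)
print(x_sparse.shape, a_sparse.shape)
At inference time, a greedy argmax over the same probabilities is a common deterministic alternative to sampling.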
Uncomment the first line if you want to subsample from the full dataset.
#data = df[['smiles']].sample(4000, random_state=42)
data = df
Initialize the featurizer with the maximum number of atoms per molecule. atom_labels is a parameter that lists the atomic numbers the featurizer should be able to parse. Similar to the num_atoms parameter above, more atom_labels means more data, though the model becomes more complex/unstable.
# create featurizer
feat = dc.feat.MolGanFeaturizer(max_atom_count=num_atoms, atom_labels=[0, 5, 6, 7, 8, 9, 11, 12, 13, 14])  # 15, 16, 17, 19, 20, 24, 29, 35, 53, 80
Extract the SMILES from the dataframe as a list of strings.
smiles = data['smiles'].values
Filter out the molecules with too many atoms, to reduce the number of unnecessary error messages in later steps.
filtered_smiles = [x for x in smiles if Chem.MolFromSmiles(x).GetNumAtoms() < num_atoms]
[13:29:08] WARNING: not removing hydrogen atom without neighbors
The next cell featurizes the filtered molecules. However, since we have limited the atomic numbers to [5, 6, 7, 8, 9, 11, 12, 13, 14], i.e. B, C, N, O, F, Na, Mg, Al, and Si, the featurizer fails to featurize several molecules in the dataset. Feel free to experiment with more atomic numbers!
# featurize molecules
features = feat.featurize(filtered_smiles)
WARNING:deepchem.feat.base_classes:Failed to featurize datapoint 0, CC(O)(P(=O)(O)O)P(=O)(O)O. Appending empty array
WARNING:deepchem.feat.base_classes:Exception message: 15
WARNING:deepchem.feat.base_classes:Failed to featurize datapoint 11, O=[N+]([O-])[O-].O=[N+]([O-])[O-].[Ca+2]. Appending empty array
WARNING:deepchem.feat.base_classes:Exception message: 20
[13:29:09] WARNING: not removing hydrogen atom without neighbors
(hundreds of similar warnings omitted: each reports a molecule containing an element outside atom_labels, gives the offending atomic number as the exception message, and appends an empty array in its place)
y WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 880, O=[N+]([O-])C(Br)(CO)CO. Appending empty a rray WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 881, OCCCBr. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 888, SCCSCCS. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 889, CCCCNC(C)(C)[PH](=O)O. Appending empty arr ay WARNING:deepchem. feat. base_classes:Exception message: 15 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 892, CCCCCCCCSCC. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 893, O=S(O)CO[Na]. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 901, COP(C)(=O)OC. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 15 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 906, [Ca+2]. [Cl-]. [Cl-]. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 911, CCCCCCCCCI. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 53 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 918, CC(C)Br. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 919, [Sb H6+3]. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 51 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 920, Cl C(Cl)C(Cl)(Cl)Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 924, O=P(O)(O)O. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 15 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 927, NC(CSCC(=O)O)C(=O)O. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 935, [Ni+2]. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 28 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 940, O=[N+]([O-])O[Cd]O[N+](=O)[O-]. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 48 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 941, CCOC(=O)C(Cl)C(C)=O. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 943, N[C@@H](COP(=O)(O)O)C(=O)O. Appending empt y array WARNING:deepchem. feat. base_classes:Exception message: 15 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 947, NNC(N)=S. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 948, CCCCS(=O)(=O)Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 949, NC(N)=S. 
Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 952, CCOC(=O)Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 953, NS(=O)(=O)[O-]. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 954, COCCl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 955, C[N+](C)(C)CCCl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 956, CCOP(=S)(OCC)SCCl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 958, CC[Ge](Cl)(CC)CC. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 960, CCOP(=O)(Cl)Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 967, O=C(O)CCl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 968, CCC(C)SSC(C)CC. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 970, O=CC(Br)(Br)Br. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 971, O=C(O)C(Br)(Br)Br. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 978, C=C(C)CCl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 979, CCCCCCCCCCl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 981, CCCCC(=O)Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 982, CCC(C)Br. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 983, N#CCBr. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 984, O=C(O)CBr. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 985, O=S(=O)(O)F. Appending empty array | deepchem.pdf |
WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 986, O=C(Cl)/C=C/C(=O)Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 987, O=C(O)CCSCCC(=O)O. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 989, FC(F)(F)C(Cl)Br. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 991, O=C(O)CCS. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 997, CO[PH](=O)OC. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 15 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1003, CC(C)(Cl)Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1008, O=C(CCl)C(Cl)Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1009, N#C[Au-]C#N. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 79 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1010, Cl C(Cl)(Cl)Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1013, CCCSC(=O)Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1017, Cl C=CCCl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1018, NCCS(=O)(=O)O. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1020, CCOP(O)OCC. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 15 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1022, FC(Cl)(Cl)C(F)(Cl)Cl. Appending empty arr ay WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1024, CN(C)C(=S)[S-]. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1027, Cl CCCCBr. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1028, CCCCCCCCCBr. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1029, Cl[Sn](Cl)(Cl)Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1030, CC(O)(P(=O)([O-])O)P(=O)([O-])O. Appendin g empty array WARNING:deepchem. feat. base_classes:Exception message: 15 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1041, CN=C=S. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1043, Cl CCOCOCCCl. 
Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1044, CC(C)=CCl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1046, C[As](C)(=O)O. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 33 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1049, COP(N)(=O)SC. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 15 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1053, O=S(=O)([O-])CC(O)CCl. Appending empty ar ray WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1054, CC(C)CP(=S)([S-])CC(C)C. Appending empty array W ARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1060, O=C([O-])C(Cl)Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1061, NNC(=S)NN. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1065, O=CCCl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1067, CCCCCCCC(=O)Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1070, NC(=O)CI. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 53 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1071, IC(I)I. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 53 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1073, FC(F)OC(F)(F)C(F)Cl. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1074, O=S(=O)(Cl)c1ccccc1. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1077, Nc1ccccc1S(=O)(=O)O. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1087, Cl Cc1ccc(Cl)cc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1090, Nc1cc(Cl)cc(Cl)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1093, Cl Cc1ccccc1Cl. Appending empty array | deepchem.pdf |
WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1094, COc1cccc(Br)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1099, Cc1cc(O)cc(C)c1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1100, S=C=NCc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1101, Cl Cc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1107, O=C=Nc1ccc(Cl)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1109, O=Cc1ccc(Cl)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1111, O=Cc1cccc(Br)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1117, O=C(CBr)c1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1120, Cc1cc(C)c(N)c(Cl)c1. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1122, NC(=S)Nc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1127, O=C(Cl)c1cc(Cl)cc(Cl)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1128, COC(=O)c1cccc(Cl)c1. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1129, O=[N+]([O-])c1ccc(Cl)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1135, [S-]c1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1137, O=[N+]([O-])c1cccc(Cl)c1Cl. Appending emp ty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1138, O=[N+]([O-])c1ccc(Cl)cc1Cl. Appending emp ty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1139, O=[N+]([O-])c1ccc(Cl)c(Cl)c1. Appending e mpty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1140, Oc1ccc(Cl)c(Cl)c1Cl. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1141, Oc1cc(Cl)cc(Cl)c1Cl. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1142, Oc1c(Cl)ccc(Cl)c1Cl. Appending empty arra y WARNING:deepchem. feat. 
base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1143, Cl C(Cl)c1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1146, O=C(Cl)OCc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1151, Br CCc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1153, Cc1c(Cl)cccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1159, Cc1ccccc1CCl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1164, Nc1ccc(Cl)cc1[N+](=O)[O-]. Appending empt y array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1167, Oc1cccc(Cl)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1170, Cc1c(N)cccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1171, Cc1ccc(Cl)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1172, Nc1ccc(Br)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1177, NC(=S)c1c(Cl)cccc1Cl. Appending empty arr ay WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1178, Cc1ccccc1Br. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1184, Br CCOc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1185, Brc1ccc(Br)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1191, CCc1ccc(Br)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 | deepchem.pdf |
WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1196, Cc1ccccc1S. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1199, O=S(=O)([O-])c1ccc(O)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1205, Nc1cc(Cl)c(N)c(Cl)c1. Appending empty arr ay WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1206, Oc1cc(Cl)c(Cl)cc1Cl. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1210, O=[N+]([O-])c1ccccc1CCl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1211, O=[N+]([O-])c1ccc(CCl)cc1. Appending empt y array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1212, O=[N+]([O-])c1cccc(CCl)c1. Appending empt y array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1213, Nc1ccc(S(=O)(=O)[O-])cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1221, CC(=O)O[Hg]c1ccccc1. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 80 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1222, NC(=O)c1c(Cl)cccc1Cl. Appending empty arr ay WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1223, Fc1cccc(Cl)c1CCl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1231, O=C(CCl)Nc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1232, COc1ccc(OC)c(Cl)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1233, Cc1cc(O)c(Cl)cc1C. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1237, O=C(Cl)c1c(Cl)cccc1Cl. Appending empty ar ray WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1241, Cc1ccc(S(=O)(=O)O)cc1. Appending empty ar ray WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1246, ICCc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 53 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1247, O=[N+]([O-])c1cccc(I)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 53 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1248, Nc1c(Cl)cc(Cl)cc1Cl. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1249, Nc1cc(Cl)ccc1O. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. 
base_classes:Failed to featurize datapoint 1250, O=Cc1ccc(Cl)cc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1251, N#Cc1ccc(Cl)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1252, COC(=O)c1ccccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1255, N#CSCc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1260, O=P(O)(O)c1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 15 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1263, Clc1ccc(Cl)c(Cl)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1274, O=[N+]([O-])c1ccc(CBr)cc1. Appending empt y array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1275, Cc1ccc(Br)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1277, COc1ccc(Br)c(C)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1278, Oc1c(Cl)c(Cl)cc(Cl)c1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1287, C=Cc1cccc(CCl)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1288, Cc1cc(O)c(Cl)c(C)c1Cl. Appending empty ar ray WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1290, Cc1ccc(S(N)(=O)=O)cc1. Appending empty ar ray WARNING:deepchem. feat. base_classes:Exception message: 16 | deepchem.pdf |
WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1291, CCC(=O)c1ccc(Cl)cc1. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1293, CC(C)(C)c1ccc(S)cc1. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1294, O=C(CS)Nc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1306, Nc1cc(Cl)ccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1307, Nc1ccc(Cl)cc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1308, O=[PH](O)c1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 15 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1313, O=C(CBr)c1ccc(O)cc1. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1314, FC(F)(F)c1ccccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1315, O=C(O)c1cccc(Cl)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1316, O=C(O)c1ccccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1317, O=C(O)c1ccc(Cl)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1318, FC(F)(F)c1cccc(Cl)c1. Appending empty arr ay WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1322, Cc1ccccc1S(=O)(=O)[O-]. Appending empty a rray WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1324, O=Cc1cc(O)ccc1Br. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1326, Clc1ccc(C(Cl)(Cl)Cl)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1328, Cc1cc(Cl)c(C)cc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1329, Clc1ccc(Cl)c(Cl)c1Cl. Appending empty arr ay WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1330, [O-]c1cc(Cl)c(Cl)cc1Cl. Appending empty a rray WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1332, COc1cc(Cl)ccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1333, CSc1ccc(C=O)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. 
base_classes:Failed to featurize datapoint 1338, Cc1ccc(Cl)c(O)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1344, Cc1cc(O)ccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1345, Oc1c(Cl)cc(Cl)c(Cl)c1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1347, COc1ccc(Cl)cc1C. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1348, Cc1cc(Cl)ccc1N. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1351, Cc1ccc(Cl)cc1N. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1352, Cc1ccc(N)cc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1353, Nc1ccc(N)c(Cl)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1354, Nc1ccc(Cl)cc1N. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1355, Nc1ccc(Cl)c(N)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1356, O=[N+]([O-])c1ccccc1Cl. Appending empty a rray WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1359, O=S(=O)(O)c1ccccc1O. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1361, N#Cc1c(Cl)cccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1363, Cl C(Cl)(Cl)c1ccccc1. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1367, Sc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1368, Nc1cc(Cl)c(O)c(Cl)c1. Appending empty arr | deepchem.pdf |
ay WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1369, O=C(O)c1ccc([Hg]Cl)cc1. Appending empty a rray WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1370, Oc1ccccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1374, SCc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1379, Oc1cc(Cl)c(Cl)c(Cl)c1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1380, Nc1cc(C(=O)O)ccc1Cl. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1381, N#CCc1ccc(Cl)cc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1383, Cc1ccc(Cl)cc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1384, Oc1cc(O)c(Cl)cc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1391, Oc1ccccc1[Hg]Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 80 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1394, O=S(=O)(O)c1ccc(Cl)cc1. Appending empty a rray WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1395, O=C(O)c1cc(Cl)cc(Cl)c1. Appending empty a rray WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1396, O=C(O)c1ccc(Cl)c(Cl)c1. Appending empty a rray WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1398, Clc1c(Cl)cc(Cl)c(Cl)c1. Appending empty a rray WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1402, Oc1ccc(Cl)c(Cl)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1403, Oc1c(Cl)cccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1405, Oc1cc(Cl)cc(Cl)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1406, Oc1cccc(Cl)c1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1407, Clc1cccc(Cl)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1408, Oc1cc(Cl)ccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1411, CC(C)(CCl)c1ccccc1. Appending empty array WARNING:deepchem. feat. 
base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1419, Nc1ccc([As](=O)([O-])O)cc1. Appending emp ty array WARNING:deepchem. feat. base_classes:Exception message: 33 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1421, O=Cc1ccc(F)c(Br)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1422, O=C(O)c1ccccc1S. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1423, O=P(O)(O)Oc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 15 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1429, Oc1cc(Cl)c(Cl)c(Cl)c1. Appending empty ar ray WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1436, Nc1ccc(Cl)c(Cl)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1437, Nc1ccccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1438, FC(F)(F)c1ccc(Cl)cc1. Appending empty arr ay WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1440, Nc1ccc([As](=O)(O)O)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 33 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1443, OCCSc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1446, O=S(=O)(O)c1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1450, Sc1ccccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1451, Cl Cc1cccc(Cl)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1453, Fc1cc(Br)ccc1CBr. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1455, Oc1ccc(Cl)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 | deepchem.pdf |
WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1457, Cc1c(Cl)cccc1[N+](=O)[O-]. Appending empt y array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1460, Cc1cc(Cl)ccc1O. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1462, Clc1ccccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1463, Clc1ccc(Cl)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1464, Oc1ccc(Cl)cc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1465, Brc1ccc(Br)c(Br)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1471, N#Cc1cc(Br)c(O)c(Br)c1. Appending empty a rray WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1472, N#Cc1cc(I)c(O)c(I)c1. Appending empty arr ay WARNING:deepchem. feat. base_classes:Exception message: 53 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1477, O=S(=O)(O)c1ccc(O)cc1. Appending empty ar ray WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1480, Clc1cc(Cl)cc(Cl)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1481, Clc1cccc(Cl)c1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1483, Clc1cc(Cl)c(Cl)c(Cl)c1. Appending empty a rray WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1484, Nc1ccc(S(N)(=O)=O)cc1. Appending empty ar ray WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1487, Nc1cccc(S(=O)(=O)O)c1. Appending empty ar ray WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1493, Brc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1498, Oc1c(Br)cc(Br)cc1Br. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1499, Oc1ccc([Hg]Cl)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1500, Oc1c(Cl)cc(Cl)cc1Cl. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1501, O=C(Cl)c1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1508, Clc1cc(Cl)c(Cl)c(Cl)c1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. 
base_classes:Failed to featurize datapoint 1509, Cc1cccc(Br)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1510, Br Cc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1513, O=[N+]([O-])c1cccc(Cl)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1517, Cl[Hg]c1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1521, Cc1ccccc1S(N)(=O)=O. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1523, Cc1ccc(Cl)c(N)c1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1525, O=C(Cl)c1ccccc1F. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1530, Fc1ccc(Br)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1533, OCc1ccc(Cl)cc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1536, S=C=NCCc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1539, S=C=Nc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1541, Cc1ccc(S)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1542, CCCCc1ccc(Cl)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1562, Cl P(Cl)c1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1571, O=Cc1ccccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1572, Nc1cccc(Cl)c1. Appending empty array | deepchem.pdf |
WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1580, CCC(=O)c1cccc(Cl)c1. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1581, O=[N+]([O-])c1ccc(F)c(Cl)c1. Appending em pty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1585, OB(O)O[Hg]c1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 80 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1588, Br Cc1ccc(Br)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 35 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1589, O=C(O)c1cc(Cl)ccc1Cl. Appending empty arr ay WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1590, O=C(O)c1c(Cl)cccc1Cl. Appending empty arr ay WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1591, O=C(O)c1cccc(Cl)c1Cl. Appending empty arr ay WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1592, O=C(O)c1ccc(Cl)cc1Cl. Appending empty arr ay WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1593, O=Cc1c(Cl)cccc1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1594, O=Cc1ccc(Cl)c(Cl)c1. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1595, O=C(Cl)c1ccc(F)c(Cl)c1. Appending empty a rray WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1598, Cc1ccc([N+](=O)[O-])cc1Cl. Appending empt y array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1599, Cc1cccc(Cl)c1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1600, NS(=O)(=O)c1ccccc1Cl. Appending empty arr ay WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1603, O=C=Nc1ccc(Cl)c(Cl)c1. Appending empty ar ray WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1605, NC(=O)Nc1ccc(Cl)cc1. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1607, O=C(CCl)c1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1608, Nc1ccc(Cl)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1609, CSc1ccc(Cl)cc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1610, Cc1ccccc1Cl. Appending empty array WARNING:deepchem. feat. 
base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1611, CS(=O)(=O)c1ccc(Cl)cc1. Appending empty a rray WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1612, Clc1ccccc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1614, Oc1nc(Cl)c(Cl)cc1Cl. Appending empty arra y WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1616, O=C(O)c1cccc(Cl)n1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1619, Clc1ccccn1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1620, Cl Cc1ccccn1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1621, Cl Cc1cccnc1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1626, Clc1nc(Cl)c(Cl)c(Cl)c1Cl. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1629, Clc1cccc(C(Cl)(Cl)Cl)n1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1637, CCc1cc(C(N)=S)ccn1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 16 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1643, Clc1ccc(C(Cl)(Cl)Cl)cn1. Appending empty array WARNING:deepchem. feat. base_classes:Exception message: 17 WARNING:deepchem. feat. base_classes:Failed to featurize datapoint 1648, Cc1nc(C)c(Cl)c(O)c1Cl. Appending empty ar ray WARNING:deepchem. feat. base_classes:Exception message: 17 | deepchem.pdf |
WARNING:deepchem.feat.base_classes:Failed to featurize datapoint 1649, O=C(O)c1nc(Cl)ccc1Cl. Appending empty array
WARNING:deepchem.feat.base_classes:Exception message: 17
WARNING:deepchem.feat.base_classes:Failed to featurize datapoint 1659, Clc1cc(Cl)c(Cl)nc1Cl. Appending empty array
WARNING:deepchem.feat.base_classes:Exception message: 17
... (the same pair of warnings repeats for every molecule the featurizer cannot handle, up through datapoint 2079) ...
WARNING:deepchem.feat.base_classes:Exception message: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (2081,) + inhomogeneous part.

Remove the molecules that failed to featurize, keeping only the entries that are valid GraphMatrix objects.
indices = [i for i, data in enumerate(features) if type(data) is GraphMatrix]
print(indices)
features = [features[i] for i in indices]
[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 13, 15, 16, 18, 19, 20, 21, 22, 23, 24, 26, 28, 29, 30, 31, 34, 35, 36, 37, 38, ... (the remaining valid indices are elided) ..., 2073, 2074, 2075, 2076, 2077, 2078, 2080]

Instantiate the MolGAN model, setting the learning rate schedule and using the maximum number of atoms (num_atoms) as the number of vertices. Then create the dataset in the input format MolGAN expects.

# create model
gan = MolGAN(learning_rate=ExponentialDecay(0.001, 0.9, 5000), vertices=num_atoms)
dataset = dc.data.NumpyDataset([x.adjacency_matrix for x in features], [x.node_features for x in features])

Define the iterbatches function, because the fit_gan function requires an iterable over the batches.

def iterbatches(epochs):
    for i in range(epochs):
        for batch in dataset.iterbatches(batch_size=gan.batch_size, pad_batches=True):
            # flatten the input because torch.nn.functional.one_hot only works with 1D inputs
            flattened_adjacency = torch.from_numpy(batch[0]).view(-1).to(dtype=torch.int64)
            # edge types cannot be negative or >= gan.edges; these entries are invalid
            invalid_mask = (flattened_adjacency < 0) | (flattened_adjacency >= gan.edges)
            # clamp the input so it can be fed to the one_hot function
            clamped_adjacency = torch.clamp(flattened_adjacency, 0, gan.edges - 1)
            adjacency_tensor = one_hot(clamped_adjacency, num_classes=gan.edges)
            # replace the invalid entries with all-zero vectors
            adjacency_tensor[invalid_mask] = torch.zeros(gan.edges, dtype=torch.long)
            # reshape back to the original shape
            adjacency_tensor = adjacency_tensor.view(*batch[0].shape, -1)

            # repeat the same one-hot encoding for the node features
            flattened_node = torch.from_numpy(batch[1]).view(-1).to(dtype=torch.int64)
            invalid_mask = (flattened_node < 0) | (flattened_node >= gan.nodes)
            clamped_node = torch.clamp(flattened_node, 0, gan.nodes - 1)
            node_tensor = one_hot(clamped_node, num_classes=gan.nodes)
            node_tensor[invalid_mask] = torch.zeros(gan.nodes, dtype=torch.long)
            node_tensor = node_tensor.view(*batch[1].shape, -1)

            yield {gan.data_inputs[0]: adjacency_tensor, gan.data_inputs[1]: node_tensor}

Train the model with the fit_gan function and generate molecules with the predict_gan_generator function.

gan.fit_gan(iterbatches(25), generator_steps=0.2, checkpoint_interval=5000)
generated_data = gan.predict_gan_generator(1000)

/usr/local/lib/python3.10/dist-packages/torch/autograd/graph.py:744: UserWarning: Attempting to run cuBLAS, but there was no current CUDA context! Attempting to set the primary context... (Triggered internally at ../aten/src/ATen/cuda/CublasHandlePool.cpp:135.)
  return Variable._execution_engine.run_backward(  # Calls into the C++ engine to run the backward pass
Ending global_step 349: generator average loss -3.55075, discriminator average loss -5.23049
TIMING: model fitting took 8.770 s
Generating 1000 samples

Convert the generated graphs back to RDKit molecules.

nmols = feat.defeaturize(generated_data)
print("{} molecules generated".format(len(nmols)))

[13:29:24] Explicit valence for atom # 2 O, 30, is greater than permitted
[13:29:24] Explicit valence for atom # 0 C, 26, is greater than permitted
[13:29:24] non-ring atom 0 marked aromatic
1000 molecules generated
[13:29:24] Explicit valence for atom # 1 B, 7, is greater than permitted
[13:29:24] non-ring atom 2 marked aromatic
[13:29:24] Explicit valence for atom # 1 C, 11, is greater than permitted

Remove the invalid molecules from the list.

nmols = list(filter(lambda x: x is not None, nmols))

Print the number of valid molecules. Training can be unstable, so this number can vary significantly between runs.

# currently training is unstable, so 0 is a common outcome
print("{} valid molecules".format(len(nmols)))

411 valid molecules

Remove duplicate generated molecules.

nmols_smiles = [Chem.MolToSmiles(m) for m in nmols]
nmols_smiles_unique = list(OrderedDict.fromkeys(nmols_smiles))
nmols_viz = [Chem.MolFromSmiles(x) for x in nmols_smiles_unique]
print("{} unique valid molecules".format(len(nmols_viz)))

48 unique valid molecules

Print out up to 100 unique valid molecules.

img = Draw.MolsToGridImage(nmols_viz[0:100], molsPerRow=5, subImgSize=(250, 250), maxMols=100, legends=None, returnPNG=False)
img
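With validity and uniqueness in hand, a third metric commonly reported for generative models is novelty: the fraction of generated molecules that do not already appear in the training data. Here is a minimal sketch; train_smiles is a hypothetical name for whatever list of training SMILES the dataset was built from earlier in this tutorial, and both sides are canonicalized with RDKit so that string comparison is meaningful.

# train_smiles is a hypothetical stand-in for the training SMILES list
train_canonical = set()
for s in train_smiles:
    mol = Chem.MolFromSmiles(s)
    if mol is not None:
        train_canonical.add(Chem.MolToSmiles(mol))  # canonical form for reliable comparison

# nmols_smiles_unique is already canonical, since it came from Chem.MolToSmiles
novel = [s for s in nmols_smiles_unique if s not in train_canonical]
if nmols_smiles_unique:
    print("{} of {} unique valid molecules are novel".format(len(novel), len(nmols_smiles_unique)))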
The grid image above is an example of what the generated molecules should look like.
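If you want to keep the generated structures for later analysis, it is easy to persist the unique SMILES to disk. A minimal sketch using pandas (imported earlier in this notebook); the filename generated_molecules.csv is just an illustrative choice:

import pandas as pd

# one row per unique valid generated molecule; the output path is illustrative
pd.DataFrame({'smiles': nmols_smiles_unique}).to_csv('generated_molecules.csv', index=False)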
Introduction to GROVER

In this tutorial, we will go over what GROVER is and how to get it up and running. GROVER, or Graph Representation frOm self-superVised mEssage passing tRansformer, is a novel framework proposed by Tencent AI Lab. GROVER uses self-supervised tasks at the node, edge, and graph levels to learn rich structural and semantic information about molecules from large unlabelled molecular datasets. GROVER integrates Message Passing Networks into a Transformer-style architecture to deliver a more expressive molecular encoding.

Reference Paper: Rong, Yu, et al. "GROVER: Self-supervised message passing transformer on large-scale molecular data." Advances in Neural Information Processing Systems (2020).

Colab

This tutorial and the rest in this sequence are designed to be done in Google Colab. If you'd like to open this notebook in Colab, you can use the following link.

Open in Colab

Setup

To run DeepChem within Colab, you'll need to run the following installation commands. This will take about 5 minutes to run to completion and install your environment. You can of course run this tutorial locally if you prefer. In that case, don't run these cells, since they will download and install Anaconda on your local machine.

Import and set up the required modules. We will first clone the repository onto the preferred platform, then install it as a library. We will also install deepchem and descriptastorus. NOTE: The original GROVER repository does not contain a setup.py file, so we are currently using a fork which does.

# Clone the forked repository.
%cd drive/My Drive
!git clone https://github.com/atreyamaj/grover.git

/content/drive/My Drive
fatal: destination path 'grover' already exists and is not an empty directory.

# Navigate to the working folder.
%cd grover

/content/drive/My Drive/grover

# Install the forked repository.
!pip install -e ./

Obtaining file:///content/drive/My Drive/grover
Installing collected packages: grover
  Running setup.py develop for grover
Successfully installed grover-1.0.0

# Install deepchem and descriptastorus.
!pip install deepchem
!pip install git+https://github.com/bp-kelley/descriptastorus

Collecting deepchem
  Downloading deepchem-2.6.1-py3-none-any.whl (608 kB)
Collecting rdkit-pypi
  Downloading rdkit_pypi-2022.3.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (22.5 MB)
... (download progress bars and already-satisfied requirements elided) ...
Installing collected packages: rdkit-pypi, deepchem
Successfully installed deepchem-2.6.1 rdkit-pypi-2022.3.1
Collecting git+https://github.com/bp-kelley/descriptastorus
  Cloning https://github.com/bp-kelley/descriptastorus to /tmp/pip-req-build-_462lldf
... (dependency resolution elided) ...
Building wheels for collected packages: descriptastorus
  Building wheel for descriptastorus (setup.py) ... done
Installing collected packages: pandas-flavor, descriptastorus
Successfully installed descriptastorus-2.3.0.6 pandas-flavor-0.2.0

Extracting semantic motif labels

The semantic motif labels are extracted by scripts/save_features.py with the feature generator fgtasklabel.

!python scripts/save_features.py --data_path exampledata/pretrain/tryout.csv \
                                 --save_path exampledata/pretrain/tryout.npz \
                                 --features_generator fgtasklabel \
                                 --restart

WARNING:root:No normalization for BCUT2D_MWHI
... (the remaining BCUT2D descriptor-normalization warnings are elided here and below) ...
100% 5970/5970 [00:09<00:00, 620.91it/s]

Extracting atom/bond contextual properties (vocabulary)

The atom/bond contextual properties (vocabulary) are extracted by scripts/build_vocab.py.

!python scripts/build_vocab.py --data_path exampledata/pretrain/tryout.csv \
                               --vocab_save_folder exampledata/pretrain \
                               --dataset_name tryout

Building atom vocab from file: exampledata/pretrain/tryout.csv
50000it [00:04, 10946.14it/s]
atom vocab size 324
Building bond vocab from file: exampledata/pretrain/tryout.csv
50000it [00:16, 3094.21it/s]
bond vocab size 353

Splitting the data

To accelerate data loading and reduce the memory cost in the multi-GPU pretraining scenario, the unlabelled molecular data need to be split into several parts using scripts/split_data.py.

!python scripts/split_data.py --data_path exampledata/pretrain/tryout.csv \
                              --features_path exampledata/pretrain/tryout.npz \
                              --sample_per_file 100 \
                              --output_path exampledata/pretrain/tryout

Number of files: 60

Running Pretraining on a Single GPU

!python main.py pretrain \
       --data_path exampledata/pretrain/tryout \
       --save_dir model/tryout \
       --atom_vocab_path exampledata/pretrain/tryout_atom_vocab.pkl \
       --bond_vocab_path exampledata/pretrain/tryout_bond_vocab.pkl \
       --batch_size 32 \
       --dropout 0.1 \
       --depth 5 \
       --num_attn_head 1 \
       --hidden_size 100 \
       --epochs 3 \
       --init_lr 0.0002 \
       --max_lr 0.0004 \
       --final_lr 0.0001 \
       --weight_decay 0.0000001 \
       --activation PReLU \
       --backbone gtrans \
       --embedding_output_type both

[WARNING] Horovod cannot be imported; multi-GPU training is unsupported
Namespace(activation='PReLU', atom_vocab_path='exampledata/pretrain/tryout_atom_vocab.pkl', backbone='gtrans', batch_size=32, bias=False, bond_drop_rate=0, bond_vocab_path='exampledata/pretrain/tryout_bond_vocab.pkl', cuda=True, data_path='exampledata/pretrain/tryout', dense=False, depth=5, dist_coff=0.1, dropout=0.1, embedding_output_type='both', enable_multi_gpu=False, epochs=3, fg_label_path=None, final_lr=0.0001, fine_tune_coff=1, hidden_size=100, init_lr=0.0002, max_lr=0.0004, no_cache=True, num_attn_head=1, num_mt_block=1, parser_name='pretrain', save_dir='model/tryout', save_interval=9999999999, undirected=False, warmup_epochs=2.0, weight_decay=1e-07)
Loading data
Number of files: 60
Number of samples: 5970
Samples/file: 100
Splitting data with seed 0.
Total size = 5,970 | train size = 5,400 | val size = 570
atom vocab size: 324, bond vocab size: 353, Number of FG tasks: 85
Pre-loaded test data: 6
... (PyTorch DataLoader worker-count UserWarnings elided) ...
Restore checkpoint, current epoch: 2
GROVEREmbedding(
  (encoders): GTransEncoder(
    ... (full module printout elided: one MTBlock of multi-headed attention over edge messages, one over node messages, four PositionwiseFeedForward readout branches, and the corresponding SublayerConnection, LayerNorm, PReLU, and Dropout layers) ...
  )
)
Total parameters: 768614
EP:3 Model Saved on: model/tryout/model.ep3
Total Time: 14.828

Training and Finetuning

Extracting Molecular Features

Given a labelled molecular dataset, we can extract additional molecular features in order to train and finetune the model from the existing pretrained checkpoint. The feature matrix is stored as .npz.

!python scripts/save_features.py --data_path exampledata/finetune/bbbp.csv \
                                 --save_path exampledata/finetune/bbbp.npz \
                                 --features_generator rdkit_2d_normalized \
                                 --restart

[21:04:21] WARNING: not removing hydrogen atom without neighbors
... (this RDKit warning repeats throughout featurization, interleaved with the progress bar) ...
100% 2039/2039 [01:19<00:00, 25.67it/s]
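Before finetuning, it can be useful to sanity-check the saved feature matrix. A minimal sketch with NumPy; note that 'features' is an assumed key for the array inside the archive (chemprop-style feature files typically use it), so check archive.files if it differs:

import numpy as np

archive = np.load('exampledata/finetune/bbbp.npz')
print(archive.files)             # list the arrays stored in the archive
features = archive['features']   # assumed key; adjust if archive.files shows otherwise
print(features.shape)            # expect one row of descriptors per molecule in bbbp.csv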
Finetuning with existing data Given the labelled dataset and the molecular features, we can use finetune function to finetune the pretrained model. ! python main. py finetune --data_path exampledata / finetune / bbbp. csv \ --features_path exampledata / finetune / bbbp. npz \ --save_dir model / finetune / bbbp / \ --checkpoint_path model / tryout / model. ep3 \ --dataset_type classification \ --split_type scaffold_balanced \ --ensemble_size 1 \ --num_folds 3 \ --no_features_scaling \ --ffn_hidden_size 200 \ --batch_size 32 \ --epochs 10 \ --init_lr 0. 00015 WARNING:root:No normalization for BCUT2D_MWHI WARNING:root:No normalization for BCUT2D_MWLOW WARNING:root:No normalization for BCUT2D_CHGHI WARNING:root:No normalization for BCUT2D_CHGLO WARNING:root:No normalization for BCUT2D_LOGPHI WARNING:root:No normalization for BCUT2D_LOGPLOW WARNING:root:No normalization for BCUT2D_MRHI WARNING:root:No normalization for BCUT2D_MRLOW [WARNING] Horovod cannot be imported; multi-GPU training is unsupported Fold 0 Loading data Number of tasks = 1 Splitting data with seed 0 100% 2039/2039 [00:00<00:00, 3681. 51it/s] Total scaffolds = 1,025 | train scaffolds = 764 | val scaffolds = 123 | test scaffolds = 138 Label averages per scaffold, in decreasing order of scaffold frequency,capped at 10 scaffolds and 20 labels: [(a rray([0. 72992701]), array([137])), (array([1. ]), array([1])), (array([0. ]), array([1])), (array([1. ]), array([1] )), (array([1. ]), array([1])), (array([0. ]), array([1])), (array([1. ]), array([1])), (array([1. ]), array([2])), (array([0. ]), array([2])), (array([1. ]), array([1]))] Class sizes p_np 0: 23. 49%, 1: 76. 51% Total size = 2,039 | train size = 1,631 | val size = 203 | test size = 205 Loading model 0 from model/tryout/model. ep3 Loading pretrained parameter "grover. encoders. edge_blocks. 0. heads. 0. mpn_q. act_func. weight". Loading pretrained parameter "grover. encoders. edge_blocks. 0. heads. 0. mpn_q. W_h. weight". Loading pretrained parameter "grover. encoders. edge_blocks. 0. heads. 0. mpn_k. act_func. weight". Loading pretrained parameter "grover. encoders. edge_blocks. 0. heads. 0. mpn_k. W_h. weight". Loading pretrained parameter "grover. encoders. edge_blocks. 0. heads. 0. mpn_v. act_func. weight". Loading pretrained parameter "grover. encoders. edge_blocks. 0. heads. 0. mpn_v. W_h. weight". Loading pretrained parameter "grover. encoders. edge_blocks. 0. act_func. weight". Loading pretrained parameter "grover. encoders. edge_blocks. 0. layernorm. weight". Loading pretrained parameter "grover. encoders. edge_blocks. 0. layernorm. bias". Loading pretrained parameter "grover. encoders. edge_blocks. 0. W_i. weight". Loading pretrained parameter "grover. encoders. edge_blocks. 0. attn. linear_layers. 0. weight". Loading pretrained parameter "grover. encoders. edge_blocks. 0. attn. linear_layers. 0. bias". Loading pretrained parameter "grover. encoders. edge_blocks. 0. attn. linear_layers. 1. weight". Loading pretrained parameter "grover. encoders. edge_blocks. 0. attn. linear_layers. 1. bias". Loading pretrained parameter "grover. encoders. edge_blocks. 0. attn. linear_layers. 2. weight". Loading pretrained parameter "grover. encoders. edge_blocks. 0. attn. linear_layers. 2. bias". Loading pretrained parameter "grover. encoders. edge_blocks. 0. attn. output_linear. weight". Loading pretrained parameter "grover. encoders. edge_blocks. 0. W_o. weight". Loading pretrained parameter "grover. encoders. edge_blocks. 0. sublayer. norm. weight". 
!python main.py finetune --data_path exampledata/finetune/bbbp.csv \
    --features_path exampledata/finetune/bbbp.npz \
    --save_dir model/finetune/bbbp/ \
    --checkpoint_path model/tryout/model.ep3 \
    --dataset_type classification \
    --split_type scaffold_balanced \
    --ensemble_size 1 \
    --num_folds 3 \
    --no_features_scaling \
    --ffn_hidden_size 200 \
    --batch_size 32 \
    --epochs 10 \
    --init_lr 0.00015

WARNING:root:No normalization for BCUT2D_MWHI
WARNING:root:No normalization for BCUT2D_MWLOW
WARNING:root:No normalization for BCUT2D_CHGHI
WARNING:root:No normalization for BCUT2D_CHGLO
WARNING:root:No normalization for BCUT2D_LOGPHI
WARNING:root:No normalization for BCUT2D_LOGPLOW
WARNING:root:No normalization for BCUT2D_MRHI
WARNING:root:No normalization for BCUT2D_MRLOW
[WARNING] Horovod cannot be imported; multi-GPU training is unsupported
Fold 0
Loading data
Number of tasks = 1
Splitting data with seed 0
100% 2039/2039 [00:00<00:00, 3681.51it/s]
Total scaffolds = 1,025 | train scaffolds = 764 | val scaffolds = 123 | test scaffolds = 138
Label averages per scaffold, in decreasing order of scaffold frequency, capped at 10 scaffolds and 20 labels: [(array([0.72992701]), array([137])), (array([1.]), array([1])), (array([0.]), array([1])), (array([1.]), array([1])), (array([1.]), array([1])), (array([0.]), array([1])), (array([1.]), array([1])), (array([1.]), array([2])), (array([0.]), array([2])), (array([1.]), array([1]))]
Class sizes
p_np 0: 23.49%, 1: 76.51%
Total size = 2,039 | train size = 1,631 | val size = 203 | test size = 205
Loading model 0 from model/tryout/model.ep3
Loading pretrained parameter "grover.encoders.edge_blocks.0.heads.0.mpn_q.act_func.weight".
Loading pretrained parameter "grover.encoders.edge_blocks.0.heads.0.mpn_q.W_h.weight".
Loading pretrained parameter "grover.encoders.edge_blocks.0.heads.0.mpn_k.act_func.weight".
Loading pretrained parameter "grover.encoders.edge_blocks.0.heads.0.mpn_k.W_h.weight".
Loading pretrained parameter "grover.encoders.edge_blocks.0.heads.0.mpn_v.act_func.weight".
Loading pretrained parameter "grover.encoders.edge_blocks.0.heads.0.mpn_v.W_h.weight".
... (similar messages follow for the remaining edge-block, node-block, feed-forward, and sublayer-norm encoder parameters)
Loading pretrained parameter "grover.encoders.act_func_node.weight".
Loading pretrained parameter "grover.encoders.act_func_edge.weight".
Pretrained parameter "av_task_atom.linear.weight" cannot be found in model parameters.
Pretrained parameter "av_task_atom.linear.bias" cannot be found in model parameters.
Pretrained parameter "av_task_bond.linear.weight" cannot be found in model parameters.
Pretrained parameter "av_task_bond.linear.bias" cannot be found in model parameters.
Pretrained parameter "bv_task_atom.linear.weight" cannot be found in model parameters.
Pretrained parameter "bv_task_atom.linear.bias" cannot be found in model parameters.
Pretrained parameter "bv_task_atom.linear_rev.weight" cannot be found in model parameters.
Pretrained parameter "bv_task_atom.linear_rev.bias" cannot be found in model parameters.
Pretrained parameter "bv_task_bond.linear.weight" cannot be found in model parameters.
Pretrained parameter "bv_task_bond.linear.bias" cannot be found in model parameters.
Pretrained parameter "bv_task_bond.linear_rev.weight" cannot be found in model parameters.
Pretrained parameter "bv_task_bond.linear_rev.bias" cannot be found in model parameters.
Pretrained parameter "fg_task_all.readout.cached_zero_vector" cannot be found in model parameters.
Pretrained parameter "fg_task_all.linear_atom_from_atom.weight" cannot be found in model parameters.
Pretrained parameter "fg_task_all.linear_atom_from_atom.bias" cannot be found in model parameters.
Pretrained parameter "fg_task_all.linear_atom_from_bond.weight" cannot be found in model parameters.
Pretrained parameter "fg_task_all.linear_atom_from_bond.bias" cannot be found in model parameters.
Pretrained parameter "fg_task_all.linear_bond_from_atom.weight" cannot be found in model parameters.
Pretrained parameter "fg_task_all.linear_bond_from_atom.bias" cannot be found in model parameters.
Pretrained parameter "fg_task_all.linear_bond_from_bond.weight" cannot be found in model parameters.
Pretrained parameter "fg_task_all.linear_bond_from_bond.bias" cannot be found in model parameters.
GroverFinetuneTask(
  (grover): GROVEREmbedding(
    (encoders): GTransEncoder(
      (edge_blocks): ModuleList(
        (0): MTBlock(
          (heads): ModuleList(
            (0): Head(
              (mpn_q): MPNEncoder(
                (dropout_layer): Dropout(p=0.1, inplace=False)
                (act_func): PReLU(num_parameters=1)
                (W_h): Linear(in_features=100, out_features=100, bias=False)
              )
              (mpn_k): MPNEncoder(
                (dropout_layer): Dropout(p=0.1, inplace=False)
                (act_func): PReLU(num_parameters=1)
                (W_h): Linear(in_features=100, out_features=100, bias=False)
              )
              (mpn_v): MPNEncoder(
                (dropout_layer): Dropout(p=0.1, inplace=False)
                (act_func): PReLU(num_parameters=1)
                (W_h): Linear(in_features=100, out_features=100, bias=False)
              )
            )
          )
          (act_func): PReLU(num_parameters=1)
          (dropout_layer): Dropout(p=0.1, inplace=False)
          (layernorm): LayerNorm((100,), eps=1e-05, elementwise_affine=True)
          (W_i): Linear(in_features=165, out_features=100, bias=False)
          (attn): MultiHeadedAttention(
            (linear_layers): ModuleList(
              (0): Linear(in_features=100, out_features=100, bias=True)
              (1): Linear(in_features=100, out_features=100, bias=True)
              (2): Linear(in_features=100, out_features=100, bias=True)
            )
            (output_linear): Linear(in_features=100, out_features=100, bias=False)
            (attention): Attention()
            (dropout): Dropout(p=0.1, inplace=False)
          )
          (W_o): Linear(in_features=100, out_features=100, bias=False)
          (sublayer): SublayerConnection(
            (norm): LayerNorm((100,), eps=1e-05, elementwise_affine=True)
            (dropout): Dropout(p=0.1, inplace=False)
          )
        )
      )
      (node_blocks): ModuleList(
        (0): MTBlock(
          (heads): ModuleList(
            (0): Head(
              (mpn_q): MPNEncoder(
                (dropout_layer): Dropout(p=0.1, inplace=False)
                (act_func): PReLU(num_parameters=1)
                (W_h): Linear(in_features=100, out_features=100, bias=False)
              )
              (mpn_k): MPNEncoder(
                (dropout_layer): Dropout(p=0.1, inplace=False)
                (act_func): PReLU(num_parameters=1)
                (W_h): Linear(in_features=100, out_features=100, bias=False)
              )
              (mpn_v): MPNEncoder(
                (dropout_layer): Dropout(p=0.1, inplace=False)
                (act_func): PReLU(num_parameters=1)
                (W_h): Linear(in_features=100, out_features=100, bias=False)
              )
            )
          )
          (act_func): PReLU(num_parameters=1)
          (dropout_layer): Dropout(p=0.1, inplace=False)
          (layernorm): LayerNorm((100,), eps=1e-05, elementwise_affine=True)
          (W_i): Linear(in_features=151, out_features=100, bias=False)
          (attn): MultiHeadedAttention(
            (linear_layers): ModuleList(
              (0): Linear(in_features=100, out_features=100, bias=True)
              (1): Linear(in_features=100, out_features=100, bias=True)
              (2): Linear(in_features=100, out_features=100, bias=True)
            )
            (output_linear): Linear(in_features=100, out_features=100, bias=False)
            (attention): Attention()
            (dropout): Dropout(p=0.1, inplace=False)
          )
          (W_o): Linear(in_features=100, out_features=100, bias=False)
          (sublayer): SublayerConnection(
            (norm): LayerNorm((100,), eps=1e-05, elementwise_affine=True)
            (dropout): Dropout(p=0.1, inplace=False)
          )
        )
      )
      (ffn_atom_from_atom): PositionwiseFeedForward(
        (W_1): Linear(in_features=251, out_features=400, bias=True)
        (W_2): Linear(in_features=400, out_features=100, bias=True)
        (dropout): Dropout(p=0.1, inplace=False)
        (act_func): PReLU(num_parameters=1)
      )
      (ffn_atom_from_bond): PositionwiseFeedForward(
        (W_1): Linear(in_features=251, out_features=400, bias=True)
        (W_2): Linear(in_features=400, out_features=100, bias=True)
        (dropout): Dropout(p=0.1, inplace=False)
        (act_func): PReLU(num_parameters=1)
      )
      (ffn_bond_from_atom): PositionwiseFeedForward(
        (W_1): Linear(in_features=265, out_features=400, bias=True)
        (W_2): Linear(in_features=400, out_features=100, bias=True)
        (dropout): Dropout(p=0.1, inplace=False)
        (act_func): PReLU(num_parameters=1)
      )
      (ffn_bond_from_bond): PositionwiseFeedForward(
        (W_1): Linear(in_features=265, out_features=400, bias=True)
        (W_2): Linear(in_features=400, out_features=100, bias=True)
        (dropout): Dropout(p=0.1, inplace=False)
        (act_func): PReLU(num_parameters=1)
      )
      (atom_from_atom_sublayer): SublayerConnection(
        (norm): LayerNorm((100,), eps=1e-05, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (atom_from_bond_sublayer): SublayerConnection(