
The Taub Faculty of Computer Science Events and Talks

Graph Neural Networks Pretraining Through Inherent Supervision for Molecular Property Prediction
Roy Benjamin (M.Sc. Thesis Seminar)
Sunday, 28.08.2022, 11:00
Zoom Lecture: 96304969082
Advisor: Dr. Kira Radinsky
Recent global events have emphasized the importance of accelerating the drug discovery process. This process may take more than a decade, and its overall cost can exceed one billion dollars. One way to address these issues is to use machine learning to increase the rate at which drugs are made available to the public while reducing the cost of the entire process. However, labeled chemical data for real-world applications is extremely scarce, making traditional approaches less effective. A promising course of action is to pretrain a model on related tasks with sufficiently large datasets and then fine-tune it for the desired task. This is challenging because creating such datasets requires labeled data or expert knowledge. To help solve this pressing issue, in this thesis we introduce MISU - Molecular Inherent SUpervision, a unique method for pretraining graph neural networks for molecular property prediction. Our method removes the need for labeled data or expert knowledge by introducing three innovative components that exploit inherent properties of molecular graphs to induce information extraction at different scales, from the local neighborhood of an atom to substructures spanning the entire molecule. We evaluate our framework on six chemical property prediction tasks and show that our method reaches state-of-the-art results compared to numerous baselines. We conduct a thorough ablation study and highlight the contribution of each component of the method. In addition, we explore the effect of MISU on various GNN architectures and find that our results are consistent with prior work on supervised pretraining.
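The pretraining idea above, supervision signals that are inherent to the molecular graph itself, can be illustrated with a minimal sketch. The example below is NOT the MISU method; it assumes a toy graph, one round of neighborhood aggregation, and node degree as a stand-in "inherent" target that requires no labels or expert annotation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 5-atom molecular graph as an adjacency matrix (nodes = atoms,
# edges = bonds). Illustrative only; not taken from the thesis.
A = np.array([
    [0, 1, 0, 0, 0],
    [1, 0, 1, 1, 0],
    [0, 1, 0, 0, 1],
    [0, 1, 0, 0, 0],
    [0, 0, 1, 0, 0],
], dtype=float)

X = rng.normal(size=(5, 8))                # random initial atom features
H = np.maximum((A + np.eye(5)) @ X, 0.0)   # one neighborhood-aggregation step

# Inherent, label-free pretraining target: node degree, computable
# directly from the graph structure itself.
y = A.sum(axis=1)
w = np.zeros(8)

# "Pretrain" a linear readout on the inherent target by gradient descent;
# in a real pipeline the pretrained encoder would then be fine-tuned on
# the scarce labeled property-prediction task.
losses = []
for _ in range(200):
    pred = H @ w
    grad = H.T @ (pred - y) / len(y)
    w -= 0.01 * grad
    losses.append(float(np.mean((pred - y) ** 2)))

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Swapping the degree target for richer structural signals (e.g., substructure membership) and the one-step aggregation for a deeper GNN recovers the general pretrain-then-finetune recipe the abstract describes.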