CHEM 750/7500-02 - Geometric and Attention-focused Deep Learning Methods for Molecular Systems: An Independent Reading Course

Semester: Fall 2023

Professor: S. Hopkins | Discipline: Physical | Campus: Waterloo

Description

Deep learning frameworks are rapidly advancing chemistry by enabling generalizable in silico predictions of numerous chemical quantities. From the accurate prediction of protein folding from FASTA strings alone, to generative networks that propose novel drug candidates, to near-instantaneous prediction of molecular HOMO-LUMO gaps, deep learning models are now impacting nearly every domain of chemistry. In this course, students will be exposed to the deep learning frameworks used in state-of-the-art cheminformatics. This includes the fundamentals of deep learning, convolutional neural networks (CNNs), recurrent neural networks (RNNs), representation learning, transformers, graph neural networks (GNNs), and the generative models prevalent in today's research landscape, such as generative adversarial networks (GANs), reinforcement learning techniques, and variational autoencoders, with a specific emphasis on applications in chemistry. Students will study these techniques through an independent literature review and will then complete a project applying deep learning to a cheminformatics task.

Materials

Week 1: September 4th – 8th: Fundamentals of Deep Learning (1)

  • Linear Regression
  • Training, Validation, and Testing Splits
  • Stochastic Gradient Descent
  • Logistic Regression, Classification
  • Decision Trees
  • Multi-Layer Perceptron Neural Network
  • Clustering and Nearest-Neighbour Methods (K-Means and K-NN)
  • Activation Functions
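
As a concrete reference point for these topics, the following is a minimal sketch (pure NumPy; the synthetic data, learning rate, and split sizes are illustrative assumptions) of logistic regression trained by stochastic gradient descent with a train/validation split:

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic binary classification data (e.g. two molecular descriptors).
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)

    # Training/validation split (80/20; a test set would be held out the same way).
    idx = rng.permutation(len(X))
    train, val = idx[:160], idx[160:]

    w, b = np.zeros(2), 0.0
    lr = 0.1

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Stochastic gradient descent on the logistic (cross-entropy) loss,
    # one randomly ordered training example at a time.
    for epoch in range(20):
        for i in rng.permutation(train):
            p = sigmoid(X[i] @ w + b)
            grad = p - y[i]            # dL/dz for sigmoid + cross-entropy
            w -= lr * grad * X[i]
            b -= lr * grad

    val_acc = ((sigmoid(X[val] @ w + b) > 0.5) == y[val]).mean()
    print(f"validation accuracy: {val_acc:.2f}")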

Readings:

  1. Hastie, T., Tibshirani, R., & Friedman, J. (2009). The Elements of Statistical Learning: Data Mining, Inference, and Prediction (2nd ed.). Springer Series in Statistics. Springer.
  2. Géron, A. (2019). Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow (2nd ed.). O'Reilly Media. https://www.oreilly.com/library/view/hands-on-machine-learning/9781492032632/

Week 2: September 11th – 15th: Fundamentals of Deep Learning (2)

  • Softmax Regression
  • Deep Neural Networks
  • Dropout
  • Bayesian Networks
  • Model Regularization
  • Weight Initialization
  • Gradient Descent with Momentum
  • Batch Normalization
  • Model Evaluation
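
A minimal PyTorch sketch tying several of these ideas together (the synthetic regression data and all hyperparameters are illustrative assumptions): a small deep network with batch normalization and dropout, explicit weight initialization, and gradient descent with momentum plus weight-decay regularization:

    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Synthetic 10-feature regression set standing in for a small chemical dataset.
    X = torch.randn(256, 10)
    y = X.sum(dim=1, keepdim=True) + 0.1 * torch.randn(256, 1)

    model = nn.Sequential(
        nn.Linear(10, 64),
        nn.BatchNorm1d(64),     # batch normalization
        nn.ReLU(),
        nn.Dropout(p=0.2),      # dropout for regularization
        nn.Linear(64, 1),
    )

    # Explicit weight initialization (Kaiming/He initialization suits ReLU nets).
    for m in model.modules():
        if isinstance(m, nn.Linear):
            nn.init.kaiming_normal_(m.weight, nonlinearity="relu")
            nn.init.zeros_(m.bias)

    # Gradient descent with momentum; weight_decay adds an L2 penalty.
    opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9, weight_decay=1e-4)
    loss_fn = nn.MSELoss()

    model.train()
    for step in range(200):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()
    model.eval()    # disables dropout, uses running batch-norm statistics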

Readings:

  1. Géron, A. (2019). Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow (2nd ed.). O'Reilly Media. https://www.oreilly.com/library/view/hands-on-machine-learning/9781492032632/
  2. Zhang, A., Lipton, Z. C., Li, M., & Smola, A. J. (2021). Dive into Deep Learning. http://arxiv.org/abs/2106.11342

Week 3: September 18th – September 22nd: Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs)

  • Convolutional Neural Network
  • Fundamentals of RNNs
  • LSTM Models
  • Application in Chemistry (Molecular Generation)
  • Dot Product Attention
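
The dot-product attention listed above reduces to a few lines; a minimal PyTorch sketch (the sequence length and embedding dimension are arbitrary illustrative choices):

    import math
    import torch

    def dot_product_attention(Q, K, V):
        """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
        d_k = Q.size(-1)
        scores = Q @ K.transpose(-2, -1) / math.sqrt(d_k)
        weights = torch.softmax(scores, dim=-1)   # attention weights over keys
        return weights @ V

    # E.g. a sequence of 12 SMILES-token embeddings of width 32 attending to itself.
    x = torch.randn(12, 32)
    out = dot_product_attention(x, x, x)   # self-attention; shape (12, 32)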

Readings:

  1. Hirohara, M., Saito, Y., Koda, Y., Sato, K., & Sakakibara, Y. (2018). Convolutional neural network based on SMILES representation of compounds for detecting chemical motif. BMC Bioinformatics, 19(19), 83–94. https://doi.org/10.1186/s12859-018-2523-5
  2. Bahdanau, D., Cho, K., & Bengio, Y. (2014). Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473.
  3. Sherstinsky, A. (2020). Fundamentals of Recurrent Neural Network (RNN) and Long Short-Term Memory (LSTM) Network. Physica D: Nonlinear Phenomena, 404, 132306. https://doi.org/10.1016/j.physd.2019.132306
  4. Bjerrum, E. J., & Threlfall, R. (2017). Molecular Generation with Recurrent Neural Networks (RNNs). arXiv preprint. https://arxiv.org/abs/1705.04612v2
  5. Segler, M. H. S., Kogej, T., Tyrchan, C., & Waller, M. P. (2018). Generating focused molecule libraries for drug discovery with recurrent neural networks. ACS Central Science, 4(1), 120–131. https://doi.org/10.1021/acscentsci.7b00512

Week 4: September 25th – 29th: Representation Learning

  • Encoder-Decoder Architectures
  • Contrastive Learning
  • Dense Vector Retrieval
  • Transfer Learning
  • Unsupervised Machine Learning
  • Applications of Representation Learning in Chemistry
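
As one concrete instance of contrastive representation learning, the following is a minimal sketch of an InfoNCE-style loss in PyTorch (the batch size, embedding width, and the idea of "two views per molecule" are illustrative assumptions, not a specific paper's recipe):

    import torch
    import torch.nn.functional as F

    def info_nce(z1, z2, temperature=0.1):
        """Contrastive (InfoNCE) loss: each z1[i] should match its
        counterpart z2[i] against all other embeddings in the batch."""
        z1 = F.normalize(z1, dim=1)
        z2 = F.normalize(z2, dim=1)
        logits = z1 @ z2.t() / temperature    # pairwise cosine similarities
        labels = torch.arange(len(z1))        # positives lie on the diagonal
        return F.cross_entropy(logits, labels)

    # E.g. encoder outputs for two "views" of the same 16 molecules
    # (two augmentations, or a 2D graph paired with a 3D conformer).
    z1, z2 = torch.randn(16, 128), torch.randn(16, 128)
    loss = info_nce(z1, z2)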

Readings:

  1. Jumper, J., Evans, R., Pritzel, A., Green, T., Figurnov, M., Ronneberger, O., Tunyasuvunakool, K., Bates, R., Žídek, A., Potapenko, A., Bridgland, A., Meyer, C., Kohl, S. A. A., Ballard, A. J., Cowie, A., Romera-Paredes, B., Nikolov, S., Jain, R., Adler, J., … Hassabis, D. (2021). Highly accurate protein structure prediction with AlphaFold. Nature, 596(7873), 583–589. https://doi.org/10.1038/s41586-021-03819-2
  2. Jaeger, S., Fulle, S., & Turk, S. (2018). Mol2vec: Unsupervised Machine Learning Approach with Chemical Intuition. Journal of Chemical Information and Modeling, 58(1), 27–35. https://doi.org/10.1021/acs.jcim.7b00616
  3. Transfer Learning for Deep Learning
  4. Xu, Z., Wang, S., Zhu, F., & Huang, J. (2017). Seq2seq Fingerprint: An Unsupervised Deep Molecular Embedding for Drug Discovery. In Proceedings of the 8th ACM International Conference on Bioinformatics, Computational Biology, and Health Informatics. https://doi.org/10.1145/3107411.3107424

Week 5: October 2nd – October 6th: Transformers

  • Fundamentals of Transformers
  • Attention Mechanisms (Multi-Head Attention, Talking-Heads Attention)
  • Few-Shot Learning
  • Multi-Modal Transformers
  • Applications in Chemistry
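
Multi-head attention is available as a built-in PyTorch module; a minimal usage sketch (batch size, sequence length, model width, and head count are arbitrary illustrative choices):

    import torch
    import torch.nn as nn

    # 8 attention heads over a model dimension of 64.
    mha = nn.MultiheadAttention(embed_dim=64, num_heads=8, batch_first=True)

    # A batch of 4 sequences of 20 token embeddings (e.g. embedded SMILES tokens).
    x = torch.randn(4, 20, 64)
    out, attn_weights = mha(x, x, x)      # self-attention: query = key = value
    print(out.shape, attn_weights.shape)  # (4, 20, 64), (4, 20, 20)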

Readings:

  1. Alammar, J. (2018). The Illustrated Transformer. https://jalammar.github.io/illustrated-transformer/
  2. Irwin, R., Dimitriadis, S., He, J., & Bjerrum, E. J. (2022). Chemformer: a pre-trained transformer for computational chemistry. Machine Learning: Science and Technology, 3(1), 015022. https://doi.org/10.1088/2632-2153/ac3ffb
  3. Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., … & Polosukhin, I. (2017). Attention is all you need. Advances in Neural Information Processing Systems, 30.

Week 6: October 9th – October 13th: Reading Week

Week 7: October 16th – October 20th: Graph Neural Networks (1)

  • Node embeddings, feature engineering
  • Graph Convolutional Neural Networks
  • Message Passing Neural Networks
  • Geometric Graph Learning
  • Applications in Chemistry
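
A minimal message-passing sketch in plain PyTorch (the dense adjacency matrix and random node features are illustrative assumptions; practical work typically uses a graph library such as PyTorch Geometric): each node aggregates transformed messages from its neighbours and updates its own representation.

    import torch
    import torch.nn as nn

    class MessagePassingLayer(nn.Module):
        """One round of neighbour aggregation:
        h_i' = update(h_i, sum_j A_ij * msg(h_j))."""
        def __init__(self, dim):
            super().__init__()
            self.msg = nn.Linear(dim, dim)
            self.update = nn.Linear(2 * dim, dim)

        def forward(self, h, adj):
            m = adj @ self.msg(h)   # sum transformed messages from neighbours
            return torch.relu(self.update(torch.cat([h, m], dim=-1)))

    # E.g. a small molecular graph: 9 atoms with 16-dimensional features
    # and a symmetric 9x9 adjacency matrix (random here for illustration).
    h = torch.randn(9, 16)
    adj = (torch.rand(9, 9) > 0.7).float()
    adj = ((adj + adj.t()) > 0).float()   # symmetrize: undirected bonds
    layer = MessagePassingLayer(16)
    h_new = layer(h, adj)                 # updated node features, shape (9, 16)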

Readings:

  1. Sanchez-Lengeling, B., Reif, E., Pearce, A., & Wiltschko, A. B. (2021). A Gentle Introduction to Graph Neural Networks. Distill. https://distill.pub/2021/gnn-intro/
  2. Gilmer, J., Schoenholz, S. S., Riley, P. F., Vinyals, O., & Dahl, G. E. (2017). Neural Message Passing for Quantum Chemistry. In Proceedings of the 34th International Conference on Machine Learning (pp. 1263–1272). PMLR. https://proceedings.mlr.press/v70/gilmer17a.html
  3. Hamilton, W. L., Ying, R., & Leskovec, J. (2017). Representation learning on graphs: Methods and applications. arXiv preprint arXiv:1709.05584.
  4. Yang, K., Swanson, K., Jin, W., Coley, C., Eiden, P., Gao, H., Guzman-Perez, A., Hopper, T., Kelley, B., Mathea, M., Palmer, A., Settels, V., Jaakkola, T., Jensen, K., & Barzilay, R. (2019). Analyzing Learned Molecular Representations for Property Prediction. Journal of Chemical Information and Modeling, 59(8), 3370–3388. https://doi.org/10.1021/acs.jcim.9b00237
  5. Stokes, J. M., Yang, K., Swanson, K., Jaakkola, T. S., Barzilay, R., Jin, W., Cubillos-Ruiz, A., Donghia, N. M., Macnair, C. R., French, S., Carfrae, L. A., Bloom-Ackermann, Z., Tran, V. M., Chiappino-Pepe, A., Badran, A. H., Andrews, I. W., Chory, E. J., Church, G. M., … Collins, J. J. (2020). A Deep Learning Approach to Antibiotic Discovery. Cell, 180(4), 688–702. https://doi.org/10.1016/j.cell.2020.01.021

Week 8: October 23rd – October 27th: Graph Neural Networks (2)

  • Heterogeneous Graph Machine Learning
  • Weisfeiler-Lehman graph isomorphism test
  • Advanced Graph Attention Mechanisms (Multi-Head Attention, Talking-Heads Attention)
  • Advanced Graph Neural Networks
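
The Weisfeiler-Lehman graph isomorphism test, which bounds the expressive power of standard message-passing GNNs, can be sketched in pure Python as hash-based colour refinement (the toy graphs and uniform atom labels below are illustrative assumptions):

    from collections import Counter

    def wl_colours(adj_list, labels, rounds=3):
        """Iterative Weisfeiler-Lehman colour refinement.
        adj_list: {node: [neighbours]}, labels: {node: initial label}."""
        colours = dict(labels)
        for _ in range(rounds):
            colours = {
                v: hash((colours[v], tuple(sorted(colours[u] for u in adj_list[v]))))
                for v in adj_list
            }
        return Counter(colours.values())   # histogram of final colours

    # Two toy 4-node graphs: a cycle vs. a path, all nodes labelled "C".
    cycle = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
    path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
    labels = {v: "C" for v in range(4)}

    # Different histograms prove non-isomorphism; equal histograms are
    # inconclusive, which is exactly the 1-WL expressiveness limit.
    print(wl_colours(cycle, labels) != wl_colours(path, labels))   # True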

Readings:

  1. Bronstein, M. (2022). Beyond Message Passing: a Physics-Inspired Paradigm for Graph Neural Networks. The Gradient.
  2. Ying, C., Cai, T., Luo, S., Zheng, S., Ke, G., He, D., Shen, Y., & Liu, T.-Y. (2021). Do Transformers Really Perform Bad for Graph Representation? Advances in Neural Information Processing Systems, 34.
  3. Withnall, M., Lindelöf, E., Engkvist, O., & Chen, H. (2020). Building attention and edge message passing neural networks for bioactivity and physical-chemical property prediction. Journal of Cheminformatics, 12(1), 1–18. https://doi.org/10.1186/s13321-019-0407-y
  4. Wang, Y., Wang, J., Cao, Z., & Barati Farimani, A. (2022). Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence, 4(3), 279–287. https://doi.org/10.1038/s42256-022-00447-x

Week 9: October 30th – November 3rd: Generative Models (1)

  • GANs
  • Generators and Discriminators
  • Augmented GANs
  • Application of GANs
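
A minimal GAN training-step sketch in PyTorch (the random "real" batch and all dimensions are placeholders): a generator maps noise to samples while a discriminator learns to separate real from generated data, and the two are updated adversarially.

    import torch
    import torch.nn as nn

    G = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 32))  # noise -> sample
    D = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 1))   # sample -> logit

    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    bce = nn.BCEWithLogitsLoss()

    real = torch.randn(32, 32)   # placeholder for a batch of real training data
    ones, zeros = torch.ones(32, 1), torch.zeros(32, 1)

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    fake = G(torch.randn(32, 16)).detach()   # detach so G is not updated here
    d_loss = bce(D(real), ones) + bce(D(fake), zeros)
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator step: update G so the discriminator scores its samples as real.
    fake = G(torch.randn(32, 16))
    g_loss = bce(D(fake), ones)
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()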

Readings:

  1. Google Developers. Generative Adversarial Networks (advanced machine learning course). https://developers.google.com/machine-learning/gan
  2. Sanchez-Lengeling, B., Outeiral, C., Guimaraes, G. L., & Aspuru-Guzik, A. (2017). Optimizing distributions over molecular space. An Objective-Reinforced Generative Adversarial Network for Inverse-design Chemistry (ORGANIC). ChemRxiv. https://doi.org/10.26434/chemrxiv.5309668.v3
  3. Kim, S., Noh, J., Gu, G. H., Aspuru-Guzik, A., & Jung, Y. (2020). Generative Adversarial Networks for Crystal Structure Prediction. ACS Central Science, 6(8), 1412–1420. https://doi.org/10.1021/acscentsci.0c00426

Week 10: November 6th – November 10th: Generative Models (2)

  • Reinforcement Learning (RL), Briefly
  • Latent Diffusion Models
  • Generative Variational AutoEncoders
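
A minimal variational autoencoder sketch in PyTorch (input width and latent size are illustrative assumptions), showing the reparameterization trick and the ELBO objective that the generative-autoencoder readings below build on:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class VAE(nn.Module):
        def __init__(self, x_dim=64, z_dim=8):
            super().__init__()
            self.enc = nn.Linear(x_dim, 2 * z_dim)   # outputs mean and log-variance
            self.dec = nn.Linear(z_dim, x_dim)

        def forward(self, x):
            mu, logvar = self.enc(x).chunk(2, dim=-1)
            # Reparameterization trick: sample z = mu + sigma * eps, keeping
            # the path from parameters to z differentiable.
            z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
            return self.dec(z), mu, logvar

    def elbo_loss(x, x_hat, mu, logvar):
        recon = F.mse_loss(x_hat, x, reduction="sum")                 # reconstruction term
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())  # KL to N(0, I) prior
        return recon + kl

    x = torch.randn(32, 64)   # placeholder batch (e.g. molecular fingerprints)
    model = VAE()
    x_hat, mu, logvar = model(x)
    loss = elbo_loss(x, x_hat, mu, logvar)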

Readings:

  1. Zhou, Z., Kearnes, S., Li, L., Zare, R. N., & Riley, P. (2019). Optimization of Molecules via Deep Reinforcement Learning. Scientific Reports, 9(1), 10752. https://doi.org/10.1038/s41598-019-47148-x
  2. Blaschke, T., Olivecrona, M., Engkvist, O., Bajorath, J., & Chen, H. (2018). Application of Generative Autoencoder in De Novo Molecular Design. Molecular Informatics, 37(1–2), 1700123. https://doi.org/10.1002/minf.201700123
  3. Popova, M., Isayev, O., & Tropsha, A. (2018). Deep reinforcement learning for de novo drug design. Science Advances, 4(7), eaap7885. https://doi.org/10.1126/sciadv.aap7885
  4. Rombach, R., Blattmann, A., Lorenz, D., Esser, P., & Ommer, B. (2022). High-resolution image synthesis with latent diffusion models. In Proceedings of the IEEE/CVF conference on computer vision and pattern recognition (pp. 10684-10695).

Evaluation

Literature Review: 4 × 10% (40%)

In weeks 2, 4, 6, and 8, students will choose one research paper and write a 750-word review demonstrating a critical understanding of the novelty of the research and its application to geometric deep learning in chemistry.

Project: (60%)

Students will implement a deep learning model for some application in chemistry, drawing on techniques from the literature that were discussed in this course.

Project outlines (10%) must be submitted to the professor and approved before 11:59:59 PM on Monday, October 16th, 2023.

Project Presentations (50%) will be given to the professor and other students in the class on Monday, December 4th, 2023 or Tuesday, December 5th, 2023

Lab/Project

Week 11: November 13th – November 17th: Project Work

Week 12: November 20th – November 24th: Project Work

Week 13: November 27th – December 1st: Project Work

Week 14: December 4th – December 5th: Project Work

Schedule

  • Tue: 3:00 pm - 4:00 pm in C2 278 / Online

Office Hours

Telephone: 519-888-4567 x33022
E-mail: [email protected]