Notes on: Gilmer, J., Schoenholz, S. S., Riley, P. F., Vinyals, O., & Dahl, G. E. (2017): Neural message passing for quantum chemistry

Terminology

density functional theory (DFT)
quantum mechanical simulation method

Message Passing Neural Networks (MPNNs)

  • Operate on undirected graphs $G$ with node features $x_v$ and edge features $e_{vw}$
  • Run for $T$ time steps
  • Defined in terms of message functions $M_t$ and vertex update functions $U_t$
  • Hidden states $h_v^t$ at each node in the graph are updated based on messages $m_v^{t+1}$ according to

    $m_v^{t+1} = \sum_{w \in N(v)} M_t(h_v^t, h_w^t, e_{vw})$

    $h_v^{t+1} = U_t(h_v^t, m_v^{t+1})$

    where in the sum, $N(v)$ denotes the neighbors of $v$

  • Readout phase computes a feature vector for the whole graph using some readout function $R$ according to

    $\hat{y} = R(\{h_v^T \mid v \in G\})$

  • Learned: the message functions $M_t$, vertex update functions $U_t$, and readout function $R$ are all learned, differentiable functions

One could also learn edge features in the graph, $e_{vw}^t$, and update them analogously to the update equations above.
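As a concrete illustration, one round of message passing plus readout can be sketched in plain numpy. Everything here is an illustrative assumption rather than the paper's actual architecture: the toy triangle graph, the sum aggregation, the linear stand-ins `W_msg` and `W_upd` for the learned $M_t$ and $U_t$, and a simple sum readout for $R$.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 4                                   # hidden-state dimension (arbitrary)
edges = [(0, 1), (1, 2), (2, 0)]        # toy undirected triangle graph
n_nodes = 3

h = rng.normal(size=(n_nodes, d))       # h_v^t: node hidden states
e = {frozenset(uv): rng.normal(size=d) for uv in edges}  # e_vw: edge features

W_msg = rng.normal(size=(3 * d, d))     # linear stand-in for M_t
W_upd = rng.normal(size=(2 * d, d))     # linear stand-in for U_t

def message(h_v, h_w, e_vw):
    # M_t(h_v, h_w, e_vw): here just a linear map of the concatenation
    return np.concatenate([h_v, h_w, e_vw]) @ W_msg

# m_v^{t+1} = sum over neighbors w in N(v) of M_t(h_v^t, h_w^t, e_vw)
m = np.zeros_like(h)
for u, v in edges:
    m[u] += message(h[u], h[v], e[frozenset((u, v))])
    m[v] += message(h[v], h[u], e[frozenset((u, v))])

# h_v^{t+1} = U_t(h_v^t, m_v^{t+1})
h = np.tanh(np.concatenate([h, m], axis=1) @ W_upd)

# Readout: R({h_v^T | v in G}) -- here a permutation-invariant sum over nodes
y_hat = h.sum(axis=0)                   # graph-level feature vector of shape (d,)
```

Note that the readout must be invariant to node permutations (sum, mean, or something like set2set in the paper) for the whole model to be a well-defined function of the graph.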

Other research framed in the form of MPNNs

Kipf & Welling (2016)

In this case we have:

$M_t(h_v^t, h_w^t, e_{vw}) = c_{vw} h_w^t$, where $c_{vw} = (\deg(v)\deg(w))^{-1/2} \tilde{A}_{vw}$

$U_t(h_v^t, m_v^{t+1}) = \mathrm{ReLU}(W^t m_v^{t+1})$

Check out the supplementary material (section 10.1.1) in gilmer17_neural_messag_passin_quant_chemis for the specific derivation of why kipf16_semi_super_class_with_graph_convol_networ corresponds to an MPNN! It's great.
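The correspondence can be checked numerically: a GCN layer computes $\tilde{D}^{-1/2}\tilde{A}\tilde{D}^{-1/2} H W$ followed by a nonlinearity, which is exactly the neighbor sum of messages $c_{vw} h_w$ followed by the update above. A minimal numpy sketch, assuming a tiny 3-node graph and an identity weight matrix $W$ purely for illustration:

```python
import numpy as np

# Self-loop-augmented adjacency of a small path-like graph (illustrative)
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
A_tilde = A + np.eye(3)                  # add self-loops
deg = A_tilde.sum(axis=1)
D_inv_sqrt = np.diag(deg ** -0.5)

H = np.ones((3, 2))                      # node hidden states h_v^t (toy values)
W = np.eye(2)                            # shared layer weights (identity here)

# Message-passing view: m_v = sum_w c_vw h_w with
# c_vw = (deg(v) deg(w))^{-1/2} * A_tilde_vw, i.e. normalized A_tilde times H
M = D_inv_sqrt @ A_tilde @ D_inv_sqrt @ H

# Update view: h_v^{t+1} = ReLU(W^t m_v^{t+1})
H_next = np.maximum(M @ W, 0.0)
```

Because the self-loop is part of $\tilde{A}$, a node's own state enters its message sum with coefficient $\deg(v)^{-1}$, so no separate self-term is needed in the update.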