The message-passing paradigm has been the “workhorse” of deep learning on graphs for several years, underpinning the success of graph neural networks in a wide range of applications, from particle physics to protein design. From a theoretical viewpoint, it established a link to the Weisfeiler-Lehman hierarchy, allowing one to analyse the expressive power of GNNs. We argue that the very “node-and-edge”-centric mindset of current graph deep learning schemes may hinder future progress in the field. As an alternative, we propose physics-inspired “continuous” learning models that open up a trove of tools from differential geometry, algebraic topology, and differential equations, so far largely unexplored in graph ML.
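To make the contrast concrete, here is a minimal sketch (not from the talk; all function names and the toy graph are illustrative) of the two ideas the abstract juxtaposes: a discrete message-passing update, where each node aggregates its neighbours' features, and a “continuous” physics-inspired update, a single explicit-Euler step of heat diffusion dx/dt = -Lx with the graph Laplacian L:

```python
def message_passing_layer(features, adjacency):
    """Discrete message passing: each node's new feature is its own
    feature plus the sum of its neighbours' features -- the basic
    aggregate-and-update step of a GNN layer (weights omitted)."""
    return {v: features[v] + sum(features[u] for u in adjacency[v])
            for v in adjacency}

def diffusion_step(features, adjacency, dt=0.1):
    """Continuous-style update: one explicit-Euler step of graph heat
    diffusion dx/dt = -L x, where L is the graph Laplacian."""
    new = {}
    for v, nbrs in adjacency.items():
        # (L x)_v = deg(v) * x_v - sum of neighbour features
        lap = len(nbrs) * features[v] - sum(features[u] for u in nbrs)
        new[v] = features[v] - dt * lap
    return new

# Toy path graph 0 - 1 - 2 with scalar node features.
adjacency = {0: [1], 1: [0, 2], 2: [1]}
x = {0: 1.0, 1: 2.0, 2: 3.0}

print(message_passing_layer(x, adjacency))  # {0: 3.0, 1: 6.0, 2: 5.0}
print(diffusion_step(x, adjacency))         # {0: 1.1, 1: 2.0, 2: 2.9}
```

The discrete layer makes a fixed jump per round, while the diffusion step discretises a differential equation whose step size `dt` (and, in principle, integration scheme) can be chosen freely, which is what opens the door to tools from differential equations.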
Part of the LMS/IMA Joint Meeting on 'The Mathematical Foundations of AI', which took place on Friday 13 October 2023 at De Morgan House, London, and online via Zoom.
==========
The London Mathematical Society has, since 1865, been the UK's learned society for the advancement, dissemination and promotion of mathematical knowledge. Our mission is to advance mathematics through our members and the broader scientific community worldwide.
For further information:
► Website: https://www.lms.ac.uk/
► Events: https://www.lms.ac.uk/events
► Grants and Prizes: https://www.lms.ac.uk/grants-prizes
► Publications: https://www.lms.ac.uk/publications
► Membership: https://www.lms.ac.uk/membership
Follow us:
► Twitter: https://twitter.com/LondMathSoc
► Facebook: https://www.facebook.com/londonmathematicalsociety
► LinkedIn: https://www.linkedin.com/company/the-london-mathematical-society/
► YouTube: @LondonMathematicalSociety