Mathematical discoveries take intuition and creativity – and now a little help from AI
A new paper published in Nature by Petar Veličković and his colleagues shows how GNNs can guide the intuition of mathematicians in both representation theory and knot theory. In a more mathematical manuscript, they prove a combinatorial conjecture from the 1980s with the help of GNNs. This article also describes their discovery.
Nature
Advancing mathematics by guiding human intuition with AI
Nature - A framework through which machine learning can guide mathematicians in discovering new conjectures and theorems is presented and shown to yield mathematical insight on important open...
Tutorial: Message Passing In Machine Learning
At NeurIPS 2021, there is a tutorial, Message Passing In Machine Learning, by Wee Sun Lee. It discusses in depth probabilistic graphical models (belief propagation, variational inference, mean field), Markov decision processes (value iteration networks), GNNs, and attention networks. It will be streamed live today for registered attendees. Slides are available online.
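The unifying view the tutorial takes — nodes exchanging messages along edges, then updating their state from an aggregate — can be sketched generically. This is a plain-Python illustration, not the tutorial's notation; the function names are mine:

```python
def message_passing_round(adj_list, state, message, aggregate, update):
    """One synchronous round: every node collects messages from its
    neighbours, aggregates them, and updates its own state."""
    new_state = {}
    for v, nbrs in adj_list.items():
        msgs = [message(state[u], state[v]) for u in nbrs]
        new_state[v] = update(state[v], aggregate(msgs))
    return new_state
```

Instantiating `message`, `aggregate`, and `update` differently recovers the tutorial's families: sum-product belief propagation, value iteration (max over actions), or learned GNN layers.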
neurips.cc
NeurIPS 2021 Schedule
NeurIPS Website
Fresh picks from ArXiv
This week on ArXiv: library for graph recommendations, embeddings for molecules, and self-explaining GNNs 🦜
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* Fast Graph Neural Tangent Kernel via Kronecker Sketching AAAI 22
* ProtGNN: Towards Self-Explaining Graph Neural Networks AAAI 22
* Graph4Rec: A Universal Toolkit with Graph Neural Networks for Recommender Systems
* Imbalanced Graph Classification via Graph-of-Graph Neural Networks
* Molecular Contrastive Learning with Chemical Element Knowledge Graph AAAI 22
* Learning Large-Time-Step Molecular Dynamics with Graph Neural Networks
Open PhD/PostDoc Positions on GML
Aleksandar Bojchevski, whom you may know from his work on robustness, adversarial attacks, and generative models on graphs, is hiring PhD students and postdocs to work on trustworthy machine learning research. The position is at the CISPA Helmholtz Center for Information Security in Saarland, Germany, and is well paid (for PhD students, around €4,000 per month before taxes). Apply here.
abojchevski.github.io
Aleksandar Bojchevski
Full Professor @ University of Cologne
Fresh picks from ArXiv
This week on ArXiv: belief propagation guarantees, scaling molecular GNNs, and debunking multi-label graph embeddings 🧙
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNN
* A Comparative Study on Robust Graph Neural Networks to Structural Noises
* Convergence of Generalized Belief Propagation Algorithm on Graphs with Motifs
* LUNAR: Unifying Local Outlier Detection Methods via Graph Neural Networks AAAI 2022
* A Self-supervised Mixed-curvature Graph Neural Network AAAI 2022
* Robustification of Online Graph Exploration Methods AAAI 2022
* Neural Belief Propagation for Scene Graph Generation
* Adaptive Kernel Graph Neural Network AAAI 2022
* On the Use of Unrealistic Predictions in Hundreds of Papers Evaluating Graph Representations AAAI 2022
* Scalable Geometric Deep Learning on Molecular Graphs
* OOD-GNN: Out-of-Distribution Generalized Graph Neural Network
PhD positions at the University of Vienna
Nils M. Kriege, whom you may know from his work on graph kernels, TUDatasets, Weisfeiler-Leman embeddings, and other important lines of research in GML, now has two PhD positions at the University of Vienna. You can read more about these positions here and here. The application deadline is December 20th. You can apply here.
New Deep Learning Model Could Accelerate the Process of Discovering New Medicines
An article on recent GeoMol model that generates new molecules in an end-to-end, non-autoregressive and SE(3)-invariant fashion.
SciTechDaily
New Deep Learning Model Could Accelerate the Process of Discovering New Medicines
Taking Some of the Guesswork Out of Drug Discovery A deep learning model rapidly predicts the 3D shapes of drug-like molecules, which could accelerate the process of discovering new medicines. In their quest to discover effective new medicines, scientists…
Machine Learning and Simulation Science (MLSim) Group Positions
Mathias Niepert, whom you may know from the recent breakthrough applying GML to cancer drug discovery, is hiring PhD students and postdocs at the University of Stuttgart. The application deadline is January 15th, 2022. Find out more here.
Telegram
Graph Machine Learning
Successful Phase I Cancer Vaccine Trials Powered by Graph ML
Transgene and NEC Corporation published a press release on the successful Phase I trials of TG4050, a neoantigen cancer vaccine, tested on ovarian cancer and head and neck cancer. The release outlines…
ICLR 2022 Workshops Announcement
The list of accepted workshops at ICLR 2022 has just been announced! There is a good selection of Graph ML-related workshops you can send your paper to:
- Geometrical and Topological Representation Learning
- Deep Learning on Graphs for Natural Language Processing
- Machine Learning for Drug Discovery (MLDD)
- Deep Generative Models for Highly Structured Data
- Workshop on the Elements of Reasoning: Objects, Structure and Causality
Workshop websites and calls are coming soon, stay tuned.
Graph ML in 2022
The editors of the Graph ML channel proudly present the winter longread (in collaboration with Anton Tsitsulin and Anvar Kurmukov) covering the major research trends of 2021:
- Graph Transformers
- Equivariant GNNs
- Generative Models for Molecules
- GNNs and Combinatorial Optimization
- Subgraph GNNs
- Scalable and Deep GNNs
- Knowledge Graph Representation Learning
- Generally Cool Research with GNNs
Besides that, the post describes new datasets and challenges, new courses and books, as well as new / updated open source libraries for graph representation learning.
Medium
Graph ML in 2022: Where Are We Now?
Hot trends and major advancements
GNN User Group Videos 2021
The NVIDIA and AWS DGL teams wish you a wonderful Holiday Season and a Happy New Year. To stay connected with their 1,000+ members, you can join their Slack channel. You may also watch the replays from the 25 global speakers who made the meetups possible in 2021.
• 12/9/2021 Session: Neptune ML: Graph Machine Learning meets Graph Database (Dr. Xiang Song, AWS AI Research and Education Lab & Joy Wang, Amazon Neptune) and Atomistic Line Graph Neural Network for improved materials property predictions (Dr. Kamal Choudhary, National Institute of Standards and Technology (NIST), Maryland).
• 10/28/2021 Session: Large-scale GNN training with DGL (Da Zheng Ph.D., Amazon) and New Trends and Results in Graph Federated Learning (Prof. Carl Yang, Emory University).
• 9/30/2021 Session: Unified Tensor - Enabling GPU-centric Data Access for Efficient Large Graph GNN Training (Seungwon Min, University of Illinois at Urbana-Champaign) and Challenges and Thinking in Go-production of GNN + DGL (Dr. Jian Zhang, AWS Shanghai AI Lab and AWS Machine Learning Solution Lab).
• 7/29/2021 Session: DGL 0.7 release (Dr. Minjie Wang, Amazon), Storing Node Features in GPU memory to speedup billion-scale GNN training (Dr. Dominique LaSalle, NVIDIA), Locally Private Graph Neural Networks (Sina Sajadmanesh, Idiap Research Institute, Switzerland) and Graph Embedding and Application in Meituan (Mengdi Zhang, Meituan).
• 6/24/2021 Session: Binary Graph Neural Networks and Dynamic Graph Models (Mehdi Bahri, Imperial College London) and Simplifying large-scale visual analysis of tricky data & models with GPUs, graphs, and ML (Leo Meyerovich, Graphistry Inc).
• 5/27/2021 Session: Graphite: GRAPH-Induced feaTure Extraction for Point Cloud Registration (Mahdi Saleh, TUM), Optimizing Graph Transformer Networks with Graph-based Techniques (Loc Hoang, University of Texas at Austin) and Encoding the Core Business Entities Using Meituan Brain (Mengdi Zhang, Meituan).
• 4/29/2021 Session: Boost then Convolve: Gradient Boosting Meets Graph Neural Networks (Dr. Sergey Ivanov, Criteo, Russia) and Inductive Representation Learning of Temporal Networks via Causal Anonymous Walks (Prof. Pan Li, Purdue University).
• 3/25/2021 Session: Therapeutics Data Commons: Machine Learning Datasets and Tasks for Therapeutics (Prof. Marinka Zitnik & Kexin Huang, Harvard University) and The Transformer Network for the Traveling Salesman Problem (Prof. Xavier Bresson, Nanyang Technological University (NTU), Singapore).
• 2/25/2021 Session: Gunrock: Graph Analytics on GPU (Dr. John Owens, University of California, Davis), NVIDIA CuGraph - An Open-Source Package for Graphs (Dr. Joe Eaton, NVIDIA) and Exploitation on Learning Mechanism of GNN (Dr. Chuan Shi, Beijing University of Posts and Telecommunications).
• 1/28/2021 Session: A Framework For Differentiable Discovery of Graph Algorithms (Dr. Le Song, Georgia Tech).
YouTube
Graph Neural Networks User Group Meetup Thursday 12/9/2021
Agenda 12/9/2021:
4:00 - 4:05 PM (PST): Welcome and Updates.
4:05 - 4:35PM (PST): Neptune ML: Graph Machine Learning meets Graph Database (Dr. Xiang Song, AWS AI Research and Education Lab & Joy Wang, Amazon Neptune).
4:35 - 5:05 PM (PST): Atomistic Line…
What does 2022 hold for Geometric & Graph ML?
Michael Bronstein and Petar Veličković released a huge post summarizing the state of Graph ML in 2021 and predicting possible breakthroughs in 2022. Moreover, the authors conducted a large-scale community study and interviewed many prominent researchers, discussing 11 key aspects:
- Rising importance of geometry in ML
- Message passing GNNs are still dominating
- Differential equations power new GNN architectures
- Old ideas from signal processing, neuroscience, and physics strike back
- Modeling complex systems with higher-order structures
- Reasoning, axiomatisation, and generalisation are still challenging
- Graphs in Reinforcement Learning
- AlphaFold 2 is a paradigm shift in structural biology
- Progress in graph transformer architectures
- Drug discovery with Geometric and Graph ML
- Quantum ML + graph-based methods
And 130 references 😉
Medium
What does 2022 hold for Geometric & Graph ML?
Leading researchers in Geometric & Graph ML summarise the progress in 2021 and make predictions for 2022
Postdoc position at the University of Milano-Bicocca with Dimitri Ognibene
The University of Milano-Bicocca is offering one postdoctoral position on machine learning and AI applied to understanding and countering social media threats to teenagers. The successful candidate will be involved in the multidisciplinary project COURAGE, funded by the Volkswagen Foundation.
The position is research-only, starting in April 2022.
Application closes: T.B.D. (about mid-February)
Starts: April - May 2022
Duration: 1.5 years
Salary: €3,100 per month after tax
Please contact us for expression of interest and preliminary information:
dimitri.ognibene@unimib.it (PI)
f.lomonaco5@campus.unimib.it
Topics: Social media, NLP, machine learning, cognitive models, opinion dynamics, computer vision, reinforcement learning, graph neural network, recommender systems, network analysis
Project: COURAGE
Informal description: https://sites.google.com/site/dimitriognibenehomepage/jobs
Google
Dimitri Ognibene's Homepage - Jobs
Currently Open Positions
Job openings - ML for molecule design at Roche, Zurich
The Swiss ML group within Prescient Design/Roche is looking for (graph) ML and (computational/structural) biology (BIO) researchers. Apply if you want to use your neural network skills to design macromolecules & help improve lives.
More information: bit.ly/3HbqQ0w (ML scientist), bit.ly/3BG0ra2 (computational biologist)
Team: www.prescient.design, a member of the Roche group
Location: Zurich/Switzerland
Relevant work: bit.ly/36mimae
Timing: asap
Contact: andreas.loukas[at]roche.com
Keywords: machine learning, graph neural networks, geometric deep learning, generative models, invariant/equivariant neural nets, reinforcement learning, protein design, rosetta
New Release of DGL v0.8
One of the most prominent libraries for graph machine learning, DGL, has recently released v0.8, with new features as well as improved system performance. The highlights are:
- A major update of the mini-batch sampling pipeline.
- Significant acceleration and code simplification of heterogeneous GNNs and link-prediction models.
- GNNLens: a DGL-powered tool for GNN explainability.
- New functions to create, transform and augment graph datasets, e.g. for graph contrastive learning or repurposing a graph for different tasks.
- DGL-Go: a new GNN model training command-line tool for quick experiments with SOTA models.
https://www.dgl.ai/release/2022/03/01/release.html
www.dgl.ai
Deep Graph Library
Library for deep learning on graphs
Recent Advances in Efficient and Scalable Graph Neural Networks
A new research blogpost by Chaitanya K. Joshi surveys the toolbox that lets Graph Neural Networks scale to real-world graphs and real-time applications.
Training and deploying GNNs to handle real-world graph data poses several theoretical and engineering challenges:
1. Giant Graphs – Memory Limitations
2. Sparse Computations – Hardware Limitations
3. Graph Subsampling – Reliability Limitations
The blogpost introduces three simple but effective ideas in the 'toolbox' for developing efficient and scalable GNNs:
- Data Preparation - From sampling large-scale graphs to CPU-GPU hybrid training via historical node embedding lookups.
- Efficient Architectures - Graph-augmented MLPs for scaling to giant networks, and efficient graph convolution designs for real-time inference on batches of graph data.
- Learning Paradigms - Combining Quantization Aware Training (low-precision model weights and activations) with Knowledge Distillation (improving efficient GNNs using expressive teacher models) to minimize inference latency while maximizing performance.
Blogpost: https://www.chaitjo.com/post/efficient-gnns/
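As a concrete illustration of the graph-subsampling idea (capping neighbourhoods so giant graphs fit in memory), here is a minimal one-hop neighbour sampler in plain Python. The function name and `fanout` parameter are illustrative, not the blogpost's API:

```python
import random

def sample_neighbors(adj_list, seeds, fanout):
    """One hop of neighbour sampling: cap each seed node's neighbourhood
    at `fanout` neighbours so mini-batch memory stays bounded."""
    block = {}
    for s in seeds:
        nbrs = adj_list[s]
        # Keep all neighbours if few, otherwise sample without replacement
        block[s] = list(nbrs) if len(nbrs) <= fanout else random.sample(nbrs, fanout)
    return block
```

Stacking one such sampling step per GNN layer yields the familiar layer-wise mini-batch pipeline; production libraries do the same thing with sparse tensors rather than Python dicts.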
Chaitanya K. Joshi
PhD Student
Learning on Graphs with Missing Node Features
A new paper and associated blogpost by Emanuele Rossi and Prof. Michael Bronstein.
Graph Neural Networks typically run under the assumption that a full set of features is available for all nodes. In real-world scenarios, features are often only partially available (for example, in social networks, age and gender may be known only for a small subset of users). Feature Propagation is an efficient and scalable approach for handling missing features in graph machine learning applications that works surprisingly well despite its simplicity.
📝 Blog Post: https://bit.ly/3ILn1Rl
💻 Code: https://bit.ly/3J9ftbr
🎥 Recording: https://bit.ly/3CbBvHW
📖 Slides: https://bit.ly/3Mh5geW
📜 Paper: https://bit.ly/3Kgo4JE
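The core scheme — diffuse known features over the graph, clamping the observed entries after every step — fits in a few lines. A minimal dense NumPy sketch (variable names are mine; the actual implementation uses sparse operations):

```python
import numpy as np

def feature_propagation(adj, x, known_mask, n_iters=40):
    """Fill missing node features by iterated diffusion with the
    symmetrically normalized adjacency, clamping observed features."""
    deg = adj.sum(1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
    a_norm = adj * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    x = np.where(known_mask[:, None], x, 0.0)  # init unknowns to zero
    x_known = x.copy()
    for _ in range(n_iters):
        x = a_norm @ x
        # Reset the features we actually observed (boundary condition)
        x = np.where(known_mask[:, None], x_known, x)
    return x
```

Each iteration is a sparse matrix-vector product, which is what makes the method cheap enough for large graphs.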
Medium
Feature Propagation is a simple and surprisingly efficient solution for learning on graphs with missing node features
The Exact Class of Graph Functions Generated by Graph Neural Networks by Mohammad Fereydounian (UPenn), Hamed Hassani (UPenn), Javid Dadashkarimi (Yale), and Amin Karbasi (Yale).
ArXiv: https://arxiv.org/abs/2202.08833
A recent pre-print discussing the connections between Graph Neural Networks (GNNs) and Dynamic Programming (DP).
The paper asks: Given a graph function, defined on an arbitrary set of edge weights and node features, is there a GNN whose output is identical to the graph function?
They show that many graph problems, e.g. min-cut value, max-flow value, and max-clique size, can be represented by a GNN. Additionally, there exist simple graphs for which no GNN can correctly find the length of the shortest paths between all nodes (a classic DP problem).
The paper's main claim is that this negative example shows that DP and GNNs are misaligned, even though (conceptually) they follow very similar iterative procedures.
This claim has been hotly debated by Graph ML Twitter, with many interesting perspectives, e.g. see the original Tweet and subsequent discussions by L. Cotta and P. Veličković.
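For context, the shortest-path DP the paper uses as its counterexample is exactly a min-aggregation message-passing scheme (Bellman-Ford). A plain-Python sketch for undirected weighted graphs (my own naming, not the paper's code):

```python
def shortest_paths(n, edges, source):
    """Bellman-Ford as message passing: in each round, every node takes
    the min over 'neighbour distance + edge weight' messages."""
    INF = float("inf")
    dist = [INF] * n
    dist[source] = 0.0
    for _ in range(n - 1):  # n-1 rounds suffice to reach every node
        for u, v, w in edges:  # undirected: relax both directions
            dist[v] = min(dist[v], dist[u] + w)
            dist[u] = min(dist[u], dist[v] + w)
    return dist
```

The debate is precisely about why a GNN, despite sharing this iterative neighbour-aggregation pattern, can fail to reproduce the DP's output on some graphs.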
Twitter
Hamed Hassani
Can GNNs compute the shortest path? Or min-cut? Or more generally, are GNNs aligned with dynamic programs? We provide an answer in arxiv.org/pdf/2202.08833…; joint work with Mohammad Fereydounian, @JDadashkarimi, and @aminkarbasi 1/
'Graph Neural Networks through the lens of algebraic topology, differential geometry, and PDEs'
A recent talk by Prof. Michael Bronstein (University of Oxford, Twitter), delivered in-person at the Computer Laboratory, University of Cambridge.
The talk is centred around the idea that graphs can be viewed as a discretisation of an underlying continuous manifold. This physics-inspired approach opens up a new trove of tools from the fields of differential geometry, algebraic topology, and differential equations so far largely unexplored in graph ML.
Recording: https://www.cl.cam.ac.uk/seminars/wednesday/video/20220309-1500-t170978.html
Associated Blogpost: https://towardsdatascience.com/graph-neural-networks-beyond-weisfeiler-lehman-and-vanilla-message-passing-bc8605fa59a
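To make the "graphs as discretised manifolds" viewpoint concrete: the continuous heat equation becomes, on a graph, dx/dt = -Lx with L the graph Laplacian. A minimal explicit-Euler sketch in NumPy (my own naming, and a simplification of the talk's setting):

```python
import numpy as np

def heat_diffusion_step(adj, x, tau=0.1):
    """One explicit Euler step of the graph heat equation dx/dt = -Lx,
    the discrete analogue of diffusion on a manifold."""
    deg = np.diag(adj.sum(1))
    lap = deg - adj  # combinatorial graph Laplacian
    return x - tau * (lap @ x)
```

Iterating this step smooths the signal along edges while conserving its total mass, which is the kind of continuous-dynamics lens the talk applies to GNN design.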
Medium
Graph Neural Networks beyond Weisfeiler-Lehman and vanilla Message Passing
Physics-inspired continuous learning models on graphs allow to overcome the limitations of traditional GNNs
Organizational update
We are very happy to share that Chaitanya K. Joshi has agreed to be one of the admins for the channel. He was already involved in several posts here and has written interesting blog posts. He is currently a PhD student at the University of Cambridge, supervised by Prof. Pietro Liò. His research explores the intersection of Graph and Geometric Deep Learning with applications in biology and drug discovery. He previously worked on Graph Neural Network architectures and applications in Combinatorial Optimization at the NTU Graph Deep Learning Lab and at A*STAR, Singapore, together with Prof. Xavier Bresson. Please welcome Chaitanya, and if you have something to share, do not hesitate to reach out to him.
Chaitanya K. Joshi
PhD Student
📃 Fresh Picks from ArXiv
The past week on the GraphML ArXiv digest: A flurry of new survey papers, GNNs for molecular property prediction and NLP/KG, as well as new avenues in GNN modelling.
📚 Surveys:
- Generative models for molecular discovery: Recent advances and challenges. ft. Wengong Jin, Tommi Jaakkola, Regina Barzilay.
- Explainability in Graph Neural Networks: An Experimental Survey.
- A Survey on Deep Graph Generation: Methods and Applications.
- Knowledge Graph Embedding Methods for Entity Alignment: An Experimental Review.
- Few-Shot Learning on Graphs: A Survey.
🧬 GNNs for Science:
- Protein Representation Learning by Geometric Structure Pretraining. ft. Jian Tang.
- Multimodal Learning on Graphs for Disease Relation Extraction. ft. Marinka Zitnik.
- MolNet: A Chemically Intuitive Graph Neural Network for Prediction of Molecular Properties.
- Simulating Liquids with Graph Networks.
🗣 GNNs for NLP and Knowledge Graphs:
- A Unified Framework for Rank-based Evaluation Metrics for Link Prediction in Knowledge Graphs. ft. Mikhail Galkin.
- Context-Dependent Anomaly Detection with Knowledge Graph Embedding Models.
- AdaLoGN: Adaptive Logic Graph Network for Reasoning-Based Machine Reading Comprehension.
- HeterMPC: A Heterogeneous Graph Neural Network for Response Generation in Multi-Party Conversations.
🌐 GNN Modelling and Applications:
- GRAND+: Scalable Graph Random Neural Networks. ft. Jie Tang.
- Graph Representation Learning with Individualization and Refinement. ft. Lee Wee Sun.
- Graph Augmentation Learning.
- SoK: Differential Privacy on Graph-Structured Data.
- Incorporating Heterophily into Graph Neural Networks for Graph Classification.
- Supervised Contrastive Learning with Structure Inference for Graph Classification.
(If I forgot to mention your paper, please shoot me a message and I will update the post. We will try to resume the 'Fresh Picks from ArXiv' series every Monday morning!)