Graph Machine Learning
Everything about graph theory, computer science, machine learning, etc.


If you have something worth sharing with the community, reach out to @gimmeblues or @chaitjo.

Admins: Sergey Ivanov; Michael Galkin; Chaitanya K. Joshi
Fresh picks from ArXiv
This week on ArXiv: feature propagation to alleviate missing node features, a new SOTA for molecular prediction, and benchmarks on GNN explanations 👴
If I forgot to mention your paper, please shoot me a message and I will update the post.

GNNs
* Multi-fidelity Stability for Graph Representation Learning with Joan Bruna
* AutoHEnsGNN: Winning Solution to AutoGraph Challenge for KDD Cup 2020
* Demystifying Graph Neural Network Explanations
* Unsupervised Learning for Identifying High Eigenvector Centrality Nodes: A Graph Neural Network Approach
* On the Unreasonable Effectiveness of Feature Propagation in Learning on Graphs with Missing Node Features with Michael Bronstein
* Directional Message Passing on Molecular Graphs via Synthetic Coordinates with Stephan Günnemann
Over-squashing, Bottlenecks, and Graph Ricci curvature

A second post by Michael Bronstein about the over-squashing effect: when exponentially many neighbors aggregate their information into a fixed-size vector, information is lost. In this post, Michael connects the over-squashing effect with Ricci curvature, a well-studied notion of curvature from differential geometry.
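
To see the effect concretely, here is a minimal sketch (the binary tree and its depth are illustrative assumptions, not from the post) counting how many nodes a k-layer GNN at the root must compress into one fixed-size vector:

```python
import networkx as nx

G = nx.balanced_tree(r=2, h=8)  # complete binary tree of depth 8
root = 0
for k in range(1, 9):
    # nodes within k hops = the receptive field of a k-layer GNN at the root
    reach = nx.single_source_shortest_path_length(G, root, cutoff=k)
    print(f"{k} layers -> {len(reach)} nodes squeezed into one fixed-size vector")
```

The count roughly doubles with every layer, while the vector the root aggregates into stays the same size; that mismatch is the over-squashing bottleneck.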
Connected Data World 2021

Today at 14:00 (Paris time), I will be on a panel at the Connected Data World conference, with a great line-up of graph ML researchers, to talk about applications of GNNs. Besides the panel, there are many other interesting talks about ML, knowledge graphs, graph databases, and more. If you want to attend, you need to register (there is a free streaming track, and the discount code CDW21SPEAKERA20 gives a discount on the full program).
Mathematical discoveries take intuition and creativity – and now a little help from AI

A new work published in Nature by Petar Veličković and his colleagues shows how GNNs can guide the intuition of mathematicians in both representation theory and knot theory. In a more mathematical manuscript, they prove a combinatorial conjecture from the 1980s with the help of GNNs. This article also describes their discovery.
Tutorial: Message Passing In Machine Learning

At NeurIPS 2021, there is a tutorial, Message Passing In Machine Learning, by Wee Sun Lee. It discusses in depth probabilistic graphical models (belief propagation, variational inference, mean field), Markov decision processes (value iteration networks), GNNs, and attention networks. It will be live today for registered attendees. Slides are available online.
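
As a taste of the MDP part, here is a minimal value-iteration sketch on a made-up two-state MDP (the numbers are arbitrary, not from the tutorial); value iteration networks unroll this kind of Bellman update as network layers:

```python
import numpy as np

gamma = 0.9
# P[a, s, t]: probability of moving from state s to t under action a
P = np.array([[[0.9, 0.1],
               [0.2, 0.8]],
              [[0.5, 0.5],
               [0.0, 1.0]]])
R = np.array([[1.0, 0.0],   # R[s, a]: immediate reward
              [0.0, 2.0]])

V = np.zeros(2)
for _ in range(200):
    # Bellman backup: Q[s, a] = R[s, a] + gamma * sum_t P[a, s, t] * V[t]
    Q = R + gamma * np.einsum("ast,t->sa", P, V)
    V_new = Q.max(axis=1)
    if np.abs(V_new - V).max() < 1e-10:
        break
    V = V_new
print("V* =", V, "greedy policy:", Q.argmax(axis=1))
```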
Open PhD/PostDoc Positions on GML

Aleksandar Bojchevski, whom you may know from his work on robustness, adversarial attacks, and generative models on graphs, is hiring PhD students and postdocs to work on trustworthy machine learning research. The position is at the CISPA Helmholtz Center for Information Security in Saarland, Germany, and is well paid (for PhD students, around 4K euros per month before taxes). Apply here.
PhD positions at the University of Vienna

Nils M. Kriege, whom you may know from his work on graph kernels, TUDatasets, Weisfeiler-Leman embeddings, and other important lines of research in GML, now has two PhD positions at the University of Vienna. You can read more about these positions here and here. The application deadline is December 20th. You can apply here.
Machine Learning and Simulation Science (MLSim) Group Positions

Mathias Niepert, whom you may know from the recent breakthrough applying GML to cancer drug discovery, is hiring PhD students and postdocs at the University of Stuttgart. The application deadline is January 15th, 2022. Find out more here.
ICLR 2022 Workshops Announcement

The list of accepted workshops at ICLR 2022 has just been announced! There is a good bunch of GraphML-related workshops you can send your paper to:

- Geometrical and Topological Representation Learning
- Deep Learning on Graphs for Natural Language Processing
- Machine Learning for Drug Discovery (MLDD)
- Deep Generative Models for Highly Structured Data
- Workshop on the Elements of Reasoning: Objects, Structure and Causality

Workshop websites and calls are coming soon, stay tuned.
Graph ML in 2022

The editors of the Graph ML channel proudly present the winter longread (in collaboration with Anton Tsitsulin and Anvar Kurmukov) covering the major research trends of 2021:

- Graph Transformers
- Equivariant GNNs
- Generative Models for Molecules
- GNNs and Combinatorial Optimization
- Subgraph GNNs
- Scalable and Deep GNNs
- Knowledge Graph Representation Learning
- Generally Cool Research with GNNs

Besides that, the post describes new datasets and challenges, new courses and books, as well as new and updated open-source libraries for graph representation learning.
GNN User Group Videos 2021

The NVIDIA and AWS DGL teams wish you a wonderful holiday season and a happy New Year. To stay connected with their 1,000+ members, you can join their Slack channel. You may also watch the replays from the 25 global speakers who made the meetups possible in 2021.


12/9/2021 Session: Neptune ML: Graph Machine Learning meets Graph Database (Dr. Xiang Song, AWS AI Research and Education Lab & Joy Wang, Amazon Neptune) and Atomistic Line Graph Neural Network for improved materials property predictions (Dr. Kamal Choudhary, National Institute of Standards and Technology (NIST), Maryland).

10/28/2021 Session: Large-scale GNN training with DGL (Da Zheng Ph.D., Amazon) and New Trends and Results in Graph Federated Learning (Prof. Carl Yang, Emory University).

9/30/2021 Session: Unified Tensor - Enabling GPU-centric Data Access for Efficient Large Graph GNN Training (Seungwon Min, University of Illinois at Urbana-Champaign) and Challenges and Thinking in Go-production of GNN + DGL (Dr. Jian Zhang, AWS Shanghai AI Lab and AWS Machine Learning Solution Lab).

7/29/2021 Session: DGL 0.7 release (Dr. Minjie Wang, Amazon), Storing Node Features in GPU memory to speedup billion-scale GNN training (Dr. Dominique LaSalle, NVIDIA), Locally Private Graph Neural Networks (Sina Sajadmanesh, Idiap Research Institute, Switzerland) and Graph Embedding and Application in Meituan (Mengdi Zhang, Meituan).

6/24/2021 Session: Binary Graph Neural Networks and Dynamic Graph Models (Mehdi Bahri, Imperial College London) and Simplifying large-scale visual analysis of tricky data & models with GPUs, graphs, and ML (Leo Meyerovich, Graphistry Inc).

5/27/2021 Session: Graphite: GRAPH-Induced feaTure Extraction for Point Cloud Registration (Mahdi Saleh, TUM), Optimizing Graph Transformer Networks with Graph-based Techniques (Loc Hoang, University of Texas at Austin), and Encoding the Core Business Entities Using Meituan Brain (Mengdi Zhang, Meituan).

4/29/2021 Session: Boost then Convolve: Gradient Boosting Meets Graph Neural Networks (Dr. Sergey Ivanov, Criteo, Russia) and Inductive Representation Learning of Temporal Networks via Causal Anonymous Walks (Prof. Pan Li, Purdue University).

3/25/2021 Session: Therapeutics Data Commons: Machine Learning Datasets and Tasks for Therapeutics (Prof. Marinka Zitnik & Kexin Huang, Harvard University) and The Transformer Network for the Traveling Salesman Problem (Prof. Xavier Bresson, Nanyang Technological University (NTU), Singapore).

2/25/2021 Session: Gunrock: Graph Analytics on GPU (Dr. John Owens, University of California, Davis), NVIDIA CuGraph - An Open-Source Package for Graphs (Dr. Joe Eaton, NVIDIA) and Exploitation on Learning Mechanism of GNN (Dr. Chuan Shi, Beijing University of Posts and Telecommunications).

1/28/2021 Session: A Framework For Differentiable Discovery of Graph Algorithms (Dr. Le Song, Georgia Tech).
What does 2022 hold for Geometric & Graph ML?

Michael Bronstein and Petar Veličković released a huge post summarizing the state of Graph ML in 2021 and predicting possible breakthroughs in 2022. Moreover, the authors conducted a large-scale community study and interviewed many prominent researchers, discussing 11 key aspects:

- Rising importance of geometry in ML
- Message passing GNNs are still dominating
- Differential equations power new GNN architectures
- Old ideas from signal processing, neuroscience, and physics strike back
- Modeling complex systems with higher-order structures
- Reasoning, axiomatisation, and generalisation are still challenging
- Graphs in Reinforcement Learning
- AlphaFold 2 is a paradigm shift in structural biology
- Progress in graph transformer architectures
- Drug discovery with Geometric and Graph ML
- Quantum ML + graph-based methods

And 130 references 😉
Postdoc position at the University of Milano-Bicocca with Dimitri Ognibene

The University of Milano-Bicocca is offering one postdoctoral position on machine learning and AI applied to understanding and countering social media threats to teenagers. The successful candidate will be involved in the multidisciplinary project COURAGE, funded by the Volkswagen Foundation.

The position is research-only, starting in April 2022.

Application Closes: T.B.D. (about mid-February)

Starts: April - May 2022

Duration: 1.5 years

Salary: €3100 per month after tax

Please contact us for expressions of interest and preliminary information:

dimitri.ognibene@unimib.it (PI)

f.lomonaco5@campus.unimib.it

Topics: Social media, NLP, machine learning, cognitive models, opinion dynamics, computer vision, reinforcement learning, graph neural network, recommender systems, network analysis

Project: COURAGE

Informal description: https://sites.google.com/site/dimitriognibenehomepage/jobs
Job openings - ML for molecule design at Roche, Zurich

The Swiss ML group within Prescient Design/Roche is looking for (graph) ML and (computational/structural) biology researchers. Apply if you want to use your neural network skills to design macromolecules & help improve lives.

More information: bit.ly/3HbqQ0w (ML scientist), bit.ly/3BG0ra2 (computational biologist)

Team: www.prescient.design, a member of the Roche group
Location: Zurich/Switzerland
Relevant work: bit.ly/36mimae
Timing: asap
Contact: andreas.loukas[at]roche.com

Keywords: machine learning, graph neural networks, geometric deep learning, generative models, invariant/equivariant neural nets, reinforcement learning, protein design, rosetta
New Release of DGL v0.8

One of the most prominent libraries for graph machine learning, DGL, has recently released v0.8, with new features as well as improvements in system performance. The highlights are:

- A major update of the mini-batch sampling pipeline (a usage sketch follows the link below).
- Significant acceleration and code simplification of heterogeneous GNNs and link-prediction models.
- GNNLens: a DGL-empowered tool for GNN explainability.
- New functions to create, transform, and augment graph datasets, e.g. for graph contrastive learning or repurposing a graph for different tasks.
- DGL-Go: a new command-line tool for quick training experiments with SOTA GNN models.

https://www.dgl.ai/release/2022/03/01/release.html
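
A minimal sketch of mini-batch neighbor sampling, assuming the v0.8 dataloading API (dgl.dataloading.NeighborSampler and dgl.dataloading.DataLoader); the graph and node IDs below are toy placeholders, so check the release notes for authoritative usage:

```python
import torch
import dgl

g = dgl.rand_graph(1000, 5000)           # toy random graph
g.ndata["feat"] = torch.randn(1000, 16)
train_nids = torch.arange(100)           # pretend these are training nodes

# sample 10 neighbors per node, for each of 2 GNN layers
sampler = dgl.dataloading.NeighborSampler([10, 10])
loader = dgl.dataloading.DataLoader(
    g, train_nids, sampler, batch_size=32, shuffle=True, drop_last=False)

for input_nodes, output_nodes, blocks in loader:
    # `blocks` are the per-layer message flow graphs for this mini-batch
    x = blocks[0].srcdata["feat"]
    # ... run a 2-layer GNN over `blocks` starting from `x`
```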
Recent Advances in Efficient and Scalable Graph Neural Networks

A new research blog post by Chaitanya K. Joshi surveying the toolbox for scaling Graph Neural Networks to real-world graphs and real-time applications.

Training and deploying GNNs to handle real-world graph data poses several theoretical and engineering challenges:
1. Giant Graphs – Memory Limitations
2. Sparse Computations – Hardware Limitations
3. Graph Subsampling – Reliability Limitations

The blog post introduces three simple but effective ideas in the 'toolbox' for developing efficient and scalable GNNs (a sketch of the precomputation idea follows the link below):
- Data Preparation - From sampling large-scale graphs to CPU-GPU hybrid training via historical node embedding lookups.
- Efficient Architectures - Graph-augmented MLPs for scaling to giant networks, and efficient graph convolution designs for real-time inference on batches of graph data.
- Learning Paradigms - Combining Quantization Aware Training (low-precision model weights and activations) with Knowledge Distillation (improving efficient GNNs using expressive teacher models) to minimize inference latency while maximizing performance.

Blogpost: https://www.chaitjo.com/post/efficient-gnns/
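
As a rough sketch of the graph-augmented MLP idea (in the spirit of SIGN-style precomputation; the tensors and sizes below are toy assumptions, not code from the post): propagate features over the graph once as preprocessing, then train a plain MLP with no message passing at training time.

```python
import torch

def precompute(adj_norm: torch.Tensor, x: torch.Tensor, hops: int):
    """Stack [X, AX, A^2 X, ...] so the graph is only touched once, offline."""
    feats = [x]
    for _ in range(hops):
        x = adj_norm @ x          # one-off propagation, can run on CPU
        feats.append(x)
    return torch.cat(feats, dim=1)

n, d = 100, 8
adj_norm = torch.eye(n)           # stand-in for the normalized adjacency
x = torch.randn(n, d)
inputs = precompute(adj_norm, x, hops=2)   # shape: (n, 3 * d)
mlp = torch.nn.Sequential(torch.nn.Linear(3 * d, 64), torch.nn.ReLU(),
                          torch.nn.Linear(64, 7))
logits = mlp(inputs)              # standard mini-batch MLP training applies
```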
Learning on Graphs with Missing Node Features

A new paper and associated blogpost by Emanuele Rossi and Prof. Michael Bronstein.

Most Graph Neural Networks run under the assumption that a full set of features is available for all nodes. In real-world scenarios, features are often only partially available (for example, in social networks, age and gender may be known only for a small subset of users). Feature Propagation is an efficient and scalable approach for handling missing features in graph machine learning applications, and it works surprisingly well despite its simplicity.

📝 Blog Post: https://bit.ly/3ILn1Rl
💻 Code: https://bit.ly/3J9ftbr
🎥 Recording: https://bit.ly/3CbBvHW
📖 Slides: https://bit.ly/3Mh5geW
📜 Paper: https://bit.ly/3Kgo4JE
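
A minimal dense sketch of the Feature Propagation idea as described in the paper: diffuse features with the normalized adjacency, then reset the known entries after every step (toy tensors; see the code link above for the authors' implementation):

```python
import torch

def feature_propagation(adj_norm, x, known_mask, n_iters=40):
    """adj_norm: (n, n) normalized adjacency; x: (n, d) features, with
    arbitrary values wherever known_mask is False."""
    out = torch.where(known_mask, x, torch.zeros_like(x))  # init missing to 0
    for _ in range(n_iters):
        out = adj_norm @ out                     # diffuse to neighbors
        out = torch.where(known_mask, x, out)    # clamp known features back
    return out

n, d = 5, 3
adj_norm = torch.full((n, n), 1.0 / n)    # stand-in normalized adjacency
x = torch.randn(n, d)
known_mask = torch.rand(n, d) > 0.5       # pretend half the entries are known
x_filled = feature_propagation(adj_norm, x, known_mask)
```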
The Exact Class of Graph Functions Generated by Graph Neural Networks by Mohammad Fereydounian (UPenn), Hamed Hassani (UPenn), Javid Dadashkarimi (Yale), and Amin Karbasi (Yale).

ArXiv: https://arxiv.org/abs/2202.08833

A recent pre-print discussing the connections between Graph Neural Networks (GNNs) and Dynamic Programming (DP).

The paper asks: Given a graph function, defined on an arbitrary set of edge weights and node features, is there a GNN whose output is identical to the graph function?

They show that many graph problems, e.g. min-cut value, max-flow value, and max-clique size, can be represented by a GNN. Additionally, there exist simple graphs for which no GNN can correctly find the lengths of the shortest paths between all nodes (a classic DP problem).

The paper's main claim is that this negative example shows that DP and GNNs are misaligned, even though (conceptually) they follow very similar iterative procedures.
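
To make the "similar iterative procedures" concrete, here is a toy sketch (the graph is made up) of Bellman-Ford, the classic shortest-path DP, written as min-aggregation message passing over edges:

```python
import math

edges = [(0, 1, 2.0), (1, 2, 1.0), (0, 2, 5.0)]  # (u, v, weight)
n, source = 3, 0

dist = [math.inf] * n
dist[source] = 0.0
for _ in range(n - 1):                 # n-1 rounds of "message passing"
    for u, v, w in edges:
        # each node v aggregates the min over incoming messages dist[u] + w
        dist[v] = min(dist[v], dist[u] + w)
print(dist)                            # [0.0, 2.0, 3.0]
```

The update looks exactly like a GNN layer with a min aggregator, which is why the paper's negative result on shortest paths sparked so much debate.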

This claim has been hotly debated on Graph ML Twitter, with many interesting perspectives; e.g., see the original tweet and subsequent discussions by L. Cotta and P. Veličković.