Graph Databases Blog Posts
4 blog posts exploring different ideas behind graph databases:
* Graph Fundamentals — Part 1: RDF
* Graph Fundamentals — Part 2: Labelled Property Graphs
* Graph Fundamentals — Part 3: Graph Schema Languages
* Graph Fundamentals — Part 4: Linked Data
Medium
Graph Fundamentals — Part 1: RDF
Graph databases are on the rise, but amid all the hype it can be hard to understand the differences under the hood. This is the first…
Graph papers and reviews at ICLR 2022
Here is a list of reviews on graph papers at ICLR 2022. Three papers received an average score of 8. For more stats, see here.
Google Docs
iclr2022_graph_papers
COLLOQUIUM PRAIRIE
A colloquium on AI which, among other topics, covers graph ML. The next talk will be on "Exploiting Graph Invariants in Deep Learning" by Marc Lelarge (Inria).
Prairie - PaRis Artificial Intelligence Research InstitutE
Colloquium PRAIRIE - Prairie
To receive PRAIRIE news and colloquium announcements sign up for PRAIRIE mailing list. (In case this automatic link does not work, send an email to sympa@inria.fr with the subject: subscribe prairie_news [your e-mail address].) Connection link: https://u…
From Mila with 💌 and graphs
A prolific week for Mila researchers:
- Michael Galkin released a new review of Knowledge Graph papers from EMNLP 2021. For those of us who didn't make it to the Dominican Republic, you can experience the premium Punta Cana content about applications of graphs in language modeling, KG construction, entity linking, and question answering.
- Best Long Paper award at EMNLP 2021 went to Visually Grounded Reasoning across Languages and Cultures by the team from Cambridge, Copenhagen, and Mila
Mila and Mila-affiliated folks run a good number of reading groups you might find useful: in addition to the GRL Reading Group and the LoGaG Reading Group, there are ones on Neural AI, Out-of-Distribution Generalization, Quantum & AI, and ML4Code
Medium
Knowledge Graphs @ EMNLP 2021
Your regular digest of KG research, November edition
Fresh picks from ArXiv
This week on ArXiv: knowledge distillation, robustness benchmark, and SVD instead of learning ✒️
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* On Representation Knowledge Distillation for Graph Neural Networks
* Can Graph Neural Networks Learn to Solve MaxSAT Problem?
* DropGNN: Random Dropouts Increase the Expressiveness of Graph Neural Networks NeurIPS 2021
* An Interpretable Graph Generative Model with Heterophily
* Convolutional Neural Network Dynamics: A Graph Perspective with Danai Koutra
Benchmarks
* Graph Robustness Benchmark: Benchmarking the Adversarial Robustness of Graph Machine Learning
GRL
* Implicit SVD for Graph Representation Learning NeurIPS 2021
* Multi-task Learning of Order-Consistent Causal Graphs with Le Song
* Adversarial Attacks on Graph Classification via Bayesian Optimisation
* Conditional Attention Networks for Distilling Knowledge Graphs in Recommendation CIKM 2021
Complex and Simple Models of Multidimensional Data : from graphs to neural networks
A mini-workshop on applications of graphs in biology. It takes place on 1 December; it's free, but registration is mandatory.
www.ihes.fr
Complex and simple models of multidimensional data : from graphs to neural networks
Introducing TensorFlow Graph Neural Networks
A new API for TF2 to build GNNs. It would be interesting to see how it compares to the PyG and DGL libraries.
blog.tensorflow.org
Introducing TensorFlow Graph Neural Networks
Introducing TensorFlow GNN, a library to build Graph Neural Networks on the TensorFlow platform.
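For a feel of what such a library packages up, here is a minimal message-passing layer written in plain TF2/Keras. This is a generic illustrative sketch, not the TF-GNN API; the layer name and the toy graph are my own:

```python
import tensorflow as tf

class SimpleGraphConv(tf.keras.layers.Layer):
    """One round of mean-aggregation message passing:
    h_v' = ReLU(W [h_v || mean_{u in N(v)} h_u])."""
    def __init__(self, units):
        super().__init__()
        self.dense = tf.keras.layers.Dense(units, activation="relu")

    def call(self, node_features, adjacency):
        # adjacency: dense [N, N] 0/1 matrix (fine for toy sizes).
        degree = tf.reduce_sum(adjacency, axis=-1, keepdims=True)
        neighbor_mean = tf.matmul(adjacency, node_features) / tf.maximum(degree, 1.0)
        return self.dense(tf.concat([node_features, neighbor_mean], axis=-1))

# Toy usage: a 4-node path graph with 3 input features per node.
adj = tf.constant([[0, 1, 0, 0],
                   [1, 0, 1, 0],
                   [0, 1, 0, 1],
                   [0, 0, 1, 0]], dtype=tf.float32)
x = tf.random.normal([4, 3])
print(SimpleGraphConv(units=8)(x, adj).shape)  # (4, 8)
```

Libraries like TF-GNN, PyG, and DGL mostly differ in how they batch such graphs, handle heterogeneous node and edge types, and scale beyond dense adjacency matrices.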
Graph Neural Networks through the lens of Differential Geometry and Algebraic Topology
Michael is back with the first post in a series on the connections between GML, differential geometry, and algebraic topology. We've been waiting for this!
Medium
Graph Neural Networks through the lens of Differential Geometry and Algebraic Topology
New perspectives on old problems in Graph ML
Fresh picks from ArXiv
This week on ArXiv: generalization guarantees, explaining bio recommendations, and learning over cosmos data 🌕
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* Generalizing Graph Neural Networks on Out-Of-Distribution Graphs
* Federated Social Recommendation with Graph Neural Network with Philip S. Yu
* Pre-training Graph Neural Network for Cross Domain Recommendation with Philip S. Yu
* Inferring halo masses with Graph Neural Networks
* Explainable Biomedical Recommendations via Reinforcement Learning Reasoning on Knowledge Graphs
Benchmark
* GRecX: An Efficient and Unified Benchmark for GNN-based Recommendation
Hardware
* QGTC: Accelerating Quantized GNN via GPU Tensor Core
Successful Phase I Cancer Vaccine Trials Powered by Graph ML
Transgene and NEC Corporation published a press release on successful Phase I trials of TG4050, a neoantigen cancer vaccine, tested on ovarian cancer and on head and neck cancer. The release outlines NEC's Neoantigen Prediction System, which is based on graph ML algorithms.
We reached out to Mathias Niepert, Chief Research Scientist at NEC Labs, to shed a bit more light on the graph ML setup, and he kindly provided a few interesting details. Mathias says:
The main graph ML method is derived from Embedding Propagation, which is a GNN that's trained in an unsupervised way and, crucially, is able to handle/impute missing data in embedding space. The most relevant papers are Learning Graph Representations with Embedding Propagation (NeurIPS 2017) and Learning Representations of Missing Data for Predicting Patient Outcomes.
A major challenge is that for each neoantigen we have some measurements but not all. Obtaining some of these requires expensive tests, and some have to be collected from previous biomedical studies. We end up with several very different feature types (requiring different ML encoders) and, for each such feature type, we only sometimes have a value. The graph-based ML method helps to impute missing values and learn a unifying embedding space.
The graph itself is created from specific similarity measures between proteins and is not given a priori. Given this graph, the task is to rank the peptide candidates that would be most effective for a given patient.
From a sample of a patient's cancer and healthy cells, you get several tens of thousands of neoantigen candidates. To manufacture a personalized vaccine, you have to narrow this down to several dozen candidates. These candidates should have two properties: (1) a high likelihood of eliciting an immune response, and (2) being distinct from healthy-cell antigens. You end up scoring the neoantigens with the ML method, taking the top K, and synthesizing the vaccine based on these. Graph ML is one component of a pretty complex system.
Mathias would like to emphasize that this is based on the work of several people at NEC and most credit should go to the domain experts who have collected the data and adapted and applied the graph ML methods to this problem.
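To make the shape of that pipeline concrete, here is a hypothetical NumPy sketch of the impute-then-rank-top-K idea described above. It is not NEC's Neoantigen Prediction System: the kNN similarity graph, the propagation scheme, and the scoring function are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 neoantigen candidates, 16 features, ~40% of measurements missing.
X_true = rng.normal(size=(200, 16))
missing = rng.random(X_true.shape) < 0.4
X_obs = np.where(missing, np.nan, X_true)

def knn_graph(X_obs, k=10):
    """Connect each candidate to its k nearest neighbors, using only the
    features both candidates actually have (a hypothetical similarity measure)."""
    n = len(X_obs)
    A = np.zeros((n, n))
    for i in range(n):
        shared = ~np.isnan(X_obs[i]) & ~np.isnan(X_obs)
        d2 = np.nansum((X_obs - X_obs[i]) ** 2 * shared, axis=1) / np.maximum(shared.sum(1), 1)
        d2[i] = np.inf
        A[i, np.argsort(d2)[:k]] = 1.0
    return np.maximum(A, A.T)  # symmetrize

def impute(X_obs, A, iters=20):
    """Propagation-style imputation: missing entries become the mean of
    neighbors' values; observed entries stay fixed."""
    X_hat = np.where(np.isnan(X_obs), 0.0, X_obs)
    deg = np.maximum(A.sum(1, keepdims=True), 1.0)
    for _ in range(iters):
        X_hat = np.where(np.isnan(X_obs), (A @ X_hat) / deg, X_obs)
    return X_hat

X_hat = impute(X_obs, knn_graph(X_obs))

# Placeholder scoring model; in the real system this would be a learned predictor
# of immunogenicity and of dissimilarity from healthy-cell antigens.
w = rng.normal(size=16)
shortlist = np.argsort(X_hat @ w)[::-1][:30]  # top-K candidates for the vaccine
print(shortlist[:5])
```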
NEC
Transgene and NEC announce positive preliminary data from Phase I studies of TG4050, a novel individualized neoantigen cancer vaccine
Positive initial data generated in the first six patients treated with TG4050 demonstrate the strong potential of this individualized immunotherapy in ovarian cancer and head and neck cancer
Fresh picks from ArXiv
This week on ArXiv: feature propagation to alleviate missing node features, new sota for molecular prediction, and benchmarks on GNN explanations 👴
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* Multi-fidelity Stability for Graph Representation Learning with Joan Bruna
* AutoHEnsGNN: Winning Solution to AutoGraph Challenge for KDD Cup 2020
* Demystifying Graph Neural Network Explanations
* Unsupervised Learning for Identifying High Eigenvector Centrality Nodes: A Graph Neural Network Approach
* On the Unreasonable Effectiveness of Feature Propagation in Learning on Graphs with Missing Node Features with Michael Bronstein
* Directional Message Passing on Molecular Graphs via Synthetic Coordinates with Stephan Günnemann
Over-squashing, Bottlenecks, and Graph Ricci curvature
A second post by Michael Bronstein about the over-squashing effect, in which exponentially many neighbors aggregate their information into a fixed-size vector, causing information loss. In this post, Michael connects the over-squashing effect with Ricci curvature, a well-studied notion of curvature from differential geometry.
Medium
Over-squashing, Bottlenecks, and Graph Ricci curvature
A concept from differential geometry called Ricci curvature allows to understand the phenomena of over-squashing and bottlenecks in GNNs
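The bottleneck picture is easy to poke at numerically. The sketch below computes a triangle-augmented Forman curvature per edge, a simpler relative of the balanced Forman curvature used in the post; the formula choice and the barbell example are my own illustration, not code from the post:

```python
import networkx as nx

def forman_curvature(G):
    """Edge (u, v): 4 - deg(u) - deg(v) + 3 * (#triangles containing the edge)."""
    curv = {}
    for u, v in G.edges():
        triangles = len(set(G.neighbors(u)) & set(G.neighbors(v)))
        curv[(u, v)] = 4 - G.degree(u) - G.degree(v) + 3 * triangles
    return curv

# Two 5-cliques joined by a single edge: the classic bottleneck graph.
G = nx.barbell_graph(5, 0)
curv = forman_curvature(G)
bridge = min(curv, key=curv.get)
print("most negatively curved edge:", bridge, curv[bridge])
# Intra-clique edges sit in many triangles (positive curvature); the bridge sits
# in none and joins two high-degree nodes, so it is strongly negatively curved,
# exactly the kind of edge through which messages get over-squashed.
```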
Connected Data World 2021
Today at 14:00 (Paris time), I will be on a panel at the Connected Data World conference, with a great lineup of graph ML researchers, to talk about applications of GNNs. Besides that, there are many other interesting talks on ML, knowledge graphs, graph databases, and more. If you want to attend, you need to register (there is a free streaming track, as well as a discount code CDW21SPEAKERA20 for the full program).
Connected Data World Conference | December 1-2-3rd 2021
Connected Data World 2021. Connecting Data, People, & Ideas since 2016.
Mathematical discoveries take intuition and creativity – and now a little help from AI
A new work published in Nature by Petar Veličković and his colleagues shows how GNNs can guide the intuition of mathematicians in both representation theory and knot theory. In a more mathematical manuscript, they prove a combinatorial conjecture from the 80s with the help of GNNs. This article also describes their discovery.
Nature
Advancing mathematics by guiding human intuition with AI
Nature - A framework through which machine learning can guide mathematicians in discovering new conjectures and theorems is presented and shown to yield mathematical insight on important open...
Tutorial: Message Passing In Machine Learning
At NeurIPS 2021, there is a tutorial, Message Passing In Machine Learning, by Wee Sun Lee. It discusses in depth probabilistic graphical models (belief propagation, variational inference, mean field), Markov decision processes (value iteration networks), GNNs, and attention networks. It will be live today for registered attendees. Slides are available online.
neurips.cc
NeurIPS 2021 Schedule
NeurIPS Website
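Since "message passing" here spans both graphical models and GNNs, a tiny sum-product example may help make the connection concrete. The sketch below runs exact belief propagation on a 4-variable chain with arbitrary toy potentials (my own example, not material from the tutorial):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 4, 3                       # 4 variables in a chain, 3 states each
unary = rng.random((n, k))        # node potentials
pair = rng.random((n - 1, k, k))  # edge potentials between variables i and i+1

# Forward messages m_{i -> i+1} and backward messages m_{i+1 -> i}.
fwd = [np.ones(k)]
for i in range(n - 1):
    fwd.append(pair[i].T @ (fwd[i] * unary[i]))
bwd = [np.ones(k)]
for i in reversed(range(n - 1)):
    bwd.insert(0, pair[i] @ (bwd[0] * unary[i + 1]))

# Node marginals: product of both incoming messages and the local potential.
marginals = np.stack([fwd[i] * unary[i] * bwd[i] for i in range(n)])
marginals /= marginals.sum(axis=1, keepdims=True)
print(marginals)  # each row sums to 1
```

A GNN layer replaces the hand-specified potentials and the sum-product rule with learned message and update functions, which is precisely the bridge the tutorial draws.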
Fresh picks from ArXiv
This week on ArXiv: library for graph recommendations, embeddings for molecules, and self-explaining GNNs 🦜
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* Fast Graph Neural Tangent Kernel via Kronecker Sketching AAAI 22
* ProtGNN: Towards Self-Explaining Graph Neural Networks AAAI 22
* Graph4Rec: A Universal Toolkit with Graph Neural Networks for Recommender Systems
* Imbalanced Graph Classification via Graph-of-Graph Neural Networks
* Molecular Contrastive Learning with Chemical Element Knowledge Graph AAAI 22
* Learning Large-Time-Step Molecular Dynamics with Graph Neural Networks
Open PhD/PostDoc Positions on GML
Aleksandar Bojchevski, whom you may know from his work on robustness, adversarial attacks, and generative models on graphs, is hiring PhD students and postdocs to work on trustworthy machine learning research. The positions are at the CISPA Helmholtz Center for Information Security in Saarland, Germany, and are well paid (for PhD students, around 4K euros per month before taxes). Apply here.
abojchevski.github.io
Aleksandar Bojchevski
Full Professor @ University of Cologne
Fresh picks from ArXiv
This week on ArXiv: belief propagation guarantees, scaling molecular GNNs, and debunking multi-label graph embeddings 🧙
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNN
* A Comparative Study on Robust Graph Neural Networks to Structural Noises
* Convergence of Generalized Belief Propagation Algorithm on Graphs with Motifs
* LUNAR: Unifying Local Outlier Detection Methods via Graph Neural Networks AAAI 2022
* A Self-supervised Mixed-curvature Graph Neural Network AAAI 2022
* Robustification of Online Graph Exploration Methods AAAI 2022
* Neural Belief Propagation for Scene Graph Generation
* Adaptive Kernel Graph Neural Network AAAI 2022
* On the Use of Unrealistic Predictions in Hundreds of Papers Evaluating Graph Representations AAAI 2022
* Scalable Geometric Deep Learning on Molecular Graphs
* OOD-GNN: Out-of-Distribution Generalized Graph Neural Network
PhD positions at the University of Vienna
Nils M. Kriege, whom you may know from his work on graph kernels, TUDatasets, Weisfeiler-Leman embeddings, and other important lines of research in GML, now has two PhD positions at the University of Vienna. You can read more about these positions here and here. The application deadline is December 20th. You can apply here.
New Deep Learning Model Could Accelerate the Process of Discovering New Medicines
An article on the recent GeoMol model, which predicts 3D structures of molecules in an end-to-end, non-autoregressive, and SE(3)-invariant fashion.
SciTechDaily
New Deep Learning Model Could Accelerate the Process of Discovering New Medicines
Taking Some of the Guesswork Out of Drug Discovery A deep learning model rapidly predicts the 3D shapes of drug-like molecules, which could accelerate the process of discovering new medicines. In their quest to discover effective new medicines, scientists…
Machine Learning and Simulation Science (MLSim) Group Positions
Mathias Niepert, whom you may know from the recent breakthrough applying GML to cancer drug discovery, is now hiring PhD students and postdocs at the University of Stuttgart. The application deadline is January 15th, 2022. Find out more here.
Telegram
Graph Machine Learning
Successful Phase I Cancer Vaccine Trials Powered by Graph ML
Transgene and NEC Corporation published a press release on the successful Phase I trials of TG4050, neoantigen cancer vaccine, tested on ovarian cancer, head and neck cancer. The release outlines…