OGB Large-Scale Challenge Workshop - Presentations of the Winners
OGB LSC is a KDD'21 challenge organized by the OGB team, known for the largest-to-date benchmarking datasets in node-level (240M nodes / 1.7B edges), link-level (90M nodes, 500M edges), and graph-level (4M molecules) tasks. Certainly, not every academic lab can afford such compute, which makes the winners' approaches all the more interesting: are there any smart tricks, or is it merely "more layers - more ensembles - GPUs go brrr"?
Finally, the recordings of the LSC workshop are available! (~3 hours long, so the Graph ML channel editors assume you've already successfully digested the ML Street Talk for breakfast)
The 2nd day of the workshop features (videos are available):
- Invited talks by Viktor Prasanna (USC), Marinka Zitnik (Harvard), and Larry Zitnick (Facebook AI)
- Panel discussion on the future of Graph ML with Yizhou Sun (UCLA), Zheng Zhang (NYU / Amazon), Shuiwang Ji (Texas A&M), and Jian Tang (MILA)
All the sessions are live talks over Zoom; you need to register for the KDD conference in order to join the event.
GML Express: Graph ML in Industry Workshop, Geometric Deep Learning, and New Software.
In case you missed recent most popular events in graph ML, here is a fresh newsletter with recent videos, courses, books, trends, and future events.
"The real voyage of discovery consists not in seeking new lands but seeing with new eyes." Marcel Proust
Graph Machine Learning in Industry workshop live
Our workshop starts in one hour and I'm excited about our speakers and talks that are ahead (something I would like to attend even if I didn't organize it). You can join us on YouTube or Zoom and we encourage you to ask questions.
The topics are:
0. Me (17:00 Paris time): opening remarks
1. James Zhang (AWS) (17:15): Challenges and Thinking in Go-production of GNN + DGL.
2. Charles Tapley Hoyt (Harvard) (17:45): Current Issues in Theory, Reproducibility, and Utility of Graph Machine Learning in the Life Sciences.
3. Anton Tsitsulin (Google) (18:15): Graph Learning for Billion Scale Graphs.
4. Cheng Ye (AstraZeneca) (19:00): Predicting Potential Drug Targets Using Tensor Factorisation and Knowledge Graph Embeddings.
5. Rocío Mercado (MIT) (19:30): Accelerating Molecular Design Using Graph-Based Deep Generative Models.
6. Lingfei Wu (JD.com) (20:00): Deep Learning On Graphs for Natural Language Processing.
Kite: An interactive visualization tool for graph theory
Kite is another tool for drawing simple graphs and running some graph algorithms on them.
Fresh picks from ArXiv
This week on ArXiv: generalization of graph embeddings, approximate message passing, and GNNs for hadron collider 🚇
If I forgot to mention your paper, please shoot me a message and I will update the post.
Knowledge graphs
* How Does Knowledge Graph Embedding Extrapolate to Unseen Data: a Semantic Evidence View
GNNs
* Graph-based Approximate Message Passing Iterations
* Orthogonal Graph Neural Networks
* Learning General Optimal Policies with Graph Neural Networks: Expressive Power, Transparency, and Limits
Applications
* Hybrid Quantum Classical Graph Neural Networks for Particle Track Reconstruction
* GeomGCL: Geometric Graph Contrastive Learning for Molecular Property Prediction
Feed-forward neural networks for graph processing: video
In this video, Charu Aggarwal discusses the simplest approach of using feedforward neural networks for graph processing. Much simpler than convolutional neural networks, they can do surprisingly well for creating node representations. The presentation is closely related to node2vec, but simplifies the presentation in many respects.
Graph Neural Networks for Point Cloud Processing: meeting
An online talk on 4th October by Mahdi Saleh on his recent work Graphite: GRAPH-Induced feaTure Extraction for Point Cloud Registration. In this presentation he discusses how graphs can be used to describe point cloud patches, detect salient points, and use them in downstream tasks such as 3D registration.
Scalable Algorithms for Semi-supervised and Unsupervised Learning
A great event from Google, Oct 5-7, on unsupervised learning, featuring many great speakers from the graph community (Andreas Krause, Piotr Indyk, David Woodruff, David Gleich, Stefanie Jegelka, Leman Akoglu, Danai Koutra, Andreas Loukas, Marinka Zitnik, and many others).
Fresh picks from ArXiv
This week on ArXiv: reconstruction conjecture for higher expressivity, decision graphs, and control in robots 🤖
If I forgot to mention your paper, please shoot me a message and I will update the post.
NeurIPS
* Motif-based Graph Self-Supervised Learning for Molecular Property Prediction NeurIPS 2021
* Reconstruction for Powerful Graph Representations NeurIPS 2021
* Be Confident! Towards Trustworthy Graph Neural Networks via Confidence Calibration NeurIPS 2021
GNNs
* Graph Pointer Neural Networks
* Equivariant Neural Network for Factor Graphs
* Tree in Tree: from Decision Trees to Decision Graphs
Applications
* Deep Fraud Detection on Non-attributed Graph
* How Neural Processes Improve Graph Link Prediction
* Coverage Control in Multi-Robot Systems via Graph Neural Networks
* Molecule3D: A Benchmark for Predicting 3D Geometries from Molecular Graphs
Graph Representation Learning Reading Group @ Mila
The more reading groups on Graph ML in different regions and timezones - the better!
This one is organized by Mila postdocs and open for participation via Zoom. The reading group starts this Thursday; the lineup for the coming weeks is published, check the website for more details.
Graph Neural Network for Lagrangian Simulation: video
A presentation by Zijie Li (CMU) on modeling fluid dynamics with GNNs.
iclr2022_papers.xlsx
349.2 KB
ICLR 2022 Submissions
Attached is the list of all submissions for ICLR 2022. In total there are 3712 submissions, of which ~270 are graph papers. About 40% are resubmissions from previous conferences, and for 75% the first author is a student.
TorchDrug Workshop
Tomorrow, October 14th, Jian Tang (Mila) will conduct a workshop, “TorchDrug: A powerful and flexible machine learning platform for drug discovery,” presenting the recently released drug discovery library TorchDrug (already 450+ stars on GitHub). TorchDrug employs GNNs, KG embedding algorithms, custom CUDA kernels, and all the fresh advancements of Geometric DL. Participation is free; the workshop starts at 11am EDT (5pm EU time).
Monday Theory: Reconstruction Conjecture + GRL 🏗
A few months back in this channel we discussed the Reconstruction Conjecture as one of the grand challenges still open in graph theory. In simple words, the conjecture says that two graphs are the same iff their decks (one-vertex-deleted subgraphs) are the same. Does it have something to do with GNNs, you ask? It certainly has!
A recently accepted NeurIPS'21 paper Reconstruction for Powerful Graph Representations is dedicated exactly to this connection. We invited the first authors, Leonardo Cotta and Christopher Morris, to explain below the main intuition and experimental results of this wonderful work.
Although GNNs are extremely popular right now, they have clear limitations. Their expressive power is limited by the 1-Weisfeiler-Leman algorithm, a simple heuristic for the graph isomorphism problem. For example, GNNs cannot approximate graph properties such as diameter, radius, girth, and subgraph counts, inspiring architectures based on the more powerful k-dimensional Weisfeiler-Leman algorithm (k-WL). However, such architectures do not scale to large graphs. Hence, it remains an open challenge to design more expressive architectures that scale to large graphs while generalizing to unseen areas.
In our work, we first show how the k-reconstruction of graphs—reconstruction from induced k-vertex subgraphs—induces a natural class of expressive GRL architectures for supervised learning with graphs, denoted k-Reconstruction Neural Networks. We then show how several existing works have their expressive power limited by k-reconstruction. Further, we show how the reconstruction conjecture’s insights lead to a provably most-expressive representation of graphs.
To make our models scalable, we propose k-Reconstruction GNNs, a general tool for boosting the expressive power and performance of GNNs with graph reconstruction. Theoretically, we characterize their expressive power showing that k-Reconstruction GNNs can distinguish graph classes that the 1-WL and 2-WL cannot, such as cycle graphs and strongly regular graphs, respectively. Further, to explain gains in real-world tasks, we show how reconstruction can act as a lower-variance estimator of the risk when the graph-generating distribution is invariant to vertex removals.
Empirically, we show that reconstruction enhances GNNs’ expressive power, making them solve multiple synthetic graph property tasks in the literature not solvable by the original GNN. On real-world datasets, we show that the increase in expressive power coupled with the lower-variance risk estimator boosts GNN’s performance up to 25%. Our theoretical and empirical results combined make another important connection between graph theory and GRL.
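To make the conjecture's central object concrete, here is a minimal sketch (assuming networkx is available; `deck` and `same_deck` are illustrative helper names, not the paper's code) of computing a graph's deck — its one-vertex-deleted subgraphs — and greedily comparing two decks up to isomorphism:

```python
# A minimal sketch of the "deck" from the Reconstruction Conjecture:
# the multiset of one-vertex-deleted induced subgraphs of a graph.
import networkx as nx

def deck(G):
    """Return the list of one-vertex-deleted subgraphs of G."""
    return [G.subgraph(set(G) - {v}).copy() for v in G]

def same_deck(G, H):
    """Check whether G and H have isomorphic decks (greedy matching)."""
    if len(G) != len(H):
        return False
    cards_h = deck(H)
    for card_g in deck(G):
        # find an unused card of H isomorphic to this card of G
        match = next((c for c in cards_h if nx.is_isomorphic(card_g, c)), None)
        if match is None:
            return False
        cards_h.remove(match)
    return True

# The conjecture says that for graphs with >= 3 vertices,
# equal decks imply isomorphic graphs.
print(same_deck(nx.cycle_graph(4), nx.cycle_graph(4)))  # True
print(same_deck(nx.path_graph(4), nx.cycle_graph(4)))   # False
```

The k-reconstruction used in the paper generalizes this from (n-1)-vertex cards to induced k-vertex subgraphs, which is what makes subsampling-based, scalable architectures possible.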
Fresh picks from ArXiv
This week on ArXiv: class-dependent generative models, label-feature propagations, and space-time GNNs 🚀
If I forgot to mention your paper, please shoot me a message and I will update the post.
Biology
* Pre-training Molecular Graph Representation with 3D Geometry
* Molecular Graph Generation via Geometric Scattering
* Iterative Refinement Graph Neural Network for Antibody Sequence-Structure Co-design with Regina Barzilay and Tommi Jaakkola
* CCGG: A Deep Autoregressive Model for Class-Conditional Graph Generation
GNNs
* Graph Neural Networks with Learnable Structural and Positional Representations with Yoshua Bengio and Xavier Bresson
* ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network with Philip S. Yu
* Training Stable Graph Neural Networks Through Constrained Learning with Alejandro Ribeiro
* Space-Time Graph Neural Networks with Alejandro Ribeiro
* Equivariant Subgraph Aggregation Networks with Michael M. Bronstein and Haggai Maron
KG
* A Survey on State-of-the-art Techniques for Knowledge Graphs Construction and Challenges ahead
LP
* Label-Wise Message Passing Graph Neural Network on Heterophilic Graphs
* Why Propagate Alone? Parallel Use of Labels and Features on Graphs
* Label Propagation across Graphs: Node Classification using Graph Neural Tangent Kernels
iGDL 2021: Israeli Geometric Deep Learning Workshop
A strong, packed workshop on sets, 3D representations, meshes, and graphs, organized by the Israeli GDL community.
Exploring Complexity: How Graph Data Science is pushing new boundaries: panel
A panel on graph data science is open for registration online. See their message below:
🔎 Exploring complexity is a challenge for all of us. The abundance of data still does not help us to make better decisions, we need to unbundle it, understand the context and find the latent relationships.
📣 Missioned to make this process simplified, as OpenCredo, we will be hosting a panel discussion with the leaders on Graph Data Science space to discuss the impacts of Graphs. Whether you are new to this space or listen to inspiring leaders of the community, this is a great opportunity!
- Dan McCreary, Distinguished Engineer at Optum (he has great blog posts at https://lnkd.in/eUFZMNh3)
- Paco Nathan, Evil Mad Scientists, a contributor to the AI/ML/Graph space (some of his amazing work at https://derwen.ai/report)
- Alessandro Negro, Chief Scientist at GraphAware (recently published Graph Powered Machine Learning, https://lnkd.in/eQycHze2)
💫 Our CTO/CEO Nicki Watt will be hosting the panel, and we are excited to have a great insight into how Graph is pushing the boundaries.
Exploring Complexity - Graph Data Science Panel - OpenCredo
Join us on the 2nd of November for a great discussion on how Graph Data Science is pushing new boundaries. Reserve your spot now!
MICCAI 2021 graph papers
Here is a guest post by Mahsa Ghorbani about applications of graphs to medical images.
A few weeks ago, the International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI) published the list of accepted papers and their reviews. About 20 of the papers are graph-related, which shows the impact of graph-based methods in medical applications. Here are two examples:
- GKD: Semi-supervised Graph Knowledge Distillation for Graph-Independent Inference
Recently, graph neural networks have shown great success in analyzing multi-modal medical data (such as demographic and imaging data) associated with a group of patients for the disease prediction problem. However, conventional methods exhibit poor performance when the graph modality is not available at inference time. GKD proposes a novel semi-supervised method inspired by the knowledge distillation framework to address this issue. The teacher block distills all the information available in the training data and then transfers it to the student network, which is trained on the input features alone, not the filtered ones. As a result, the student works well on test data without a graph between the patients. GKD also uses a modified label-propagation algorithm in the teacher block to keep a balance between neighborhood features and node features.
- Early Detection of Liver Fibrosis Using Graph Convolutional Networks
Fibrosis refers to the deposition of collagen in tissue, which can lead to organ dysfunction and even organ failure. Typically, histochemical stains are used to visualize collagen, and detecting the early onset of fibrosis is critical for identifying long-term damage and potential loss of organ function. This paper uses a collagen segmentation method to extract a collagen map from an input histopathological image and then decomposes it into a set of tiles. The tiles are clustered, and the clusters are classified based on a few samples in them (visually). The tiles clustered as dense collagen are used as centers, and each remaining tile is connected to its closest center (a Voronoi tessellation). After a set of graph convolutional layers, an attention mechanism aggregates the tile features and predicts the fibrosis stage of the input image.
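The tile-to-nearest-center graph construction described above can be sketched as follows (a minimal illustration with hypothetical tile coordinates and labels, not the paper's code): tiles labeled as dense collagen become centers, and every other tile is connected to its nearest center, i.e. a Voronoi-style assignment over tile positions.

```python
# Hypothetical sketch: connect each tile to its nearest dense-collagen center.
import math

def build_tile_graph(tiles, dense_ids):
    """tiles: {tile_id: (x, y)} tile positions;
    dense_ids: ids of tiles clustered as dense collagen (the centers).
    Returns a list of edges (tile_id, nearest_center_id)."""
    edges = []
    for tid, (x, y) in tiles.items():
        if tid in dense_ids:
            continue  # centers are not reassigned
        # nearest center by Euclidean distance = Voronoi cell membership
        nearest = min(dense_ids, key=lambda c: math.dist((x, y), tiles[c]))
        edges.append((tid, nearest))
    return edges

tiles = {0: (0, 0), 1: (5, 5), 2: (1, 1), 3: (4, 4)}
print(build_tile_graph(tiles, dense_ids={0, 1}))  # [(2, 0), (3, 1)]
```

In the paper this graph is then fed to graph convolutional layers with attention pooling; the sketch only covers the structural step.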
MICCAI 2021 - 24th International Conference on Medical Image Computing & Computer Assisted Intervention
MICCAI 2021, the 24th International Conference on Medical Image Computing and Computer Assisted Intervention, will be held from September 27th to October 1st, 2021 in Strasbourg, France. MICCAI 2021 is organized in collaboration with the University of Strasbourg.
Probabilistic Symmetries and Invariant Neural Networks
A recent survey on neural networks that are invariant/equivariant under group actions (of which GNNs are a special class). Among other things, it does a good job of covering the significant 20th-century works that laid the foundation for invariant neural networks.
Fresh picks from ArXiv
This week on ArXiv: GNNs for imbalanced case, relationships behind KGs, and software optimization of GNNs 💻
If I forgot to mention your paper, please shoot me a message and I will update the post.
GNNs
* Distance-wise Prototypical Graph Neural Network in Node Imbalance Classification with Tyler Derr
* Multi-view Contrastive Graph Clustering NeurIPS 2021
* Learning to Learn Graph Topologies NeurIPS 2021
Studies
* What is Learned in Knowledge Graph Embeddings?
* Understanding GNN Computational Graph: A Coordinated Computation, IO, and Memory Perspective
WSDM2022 Challenge from DGL Team
A really nice competition by DGL on temporal link prediction on two large-scale graph datasets. The dates are Oct 15 - Jan 20. Prize pool: $3500 + WSDM registration.