We show that message passing neural networks (MPNNs) are implicitly trained to respect graph functional distances, and introduce the weighted Weisfeiler Leman Labeling Tree (WILT) to identify subgraphs that MPNNs consider functionally important.
Thursday at ICML:
WILTing Trees: Interpreting the Distance Between MPNN Embeddings
openreview.net/forum?id=lfl...
Visit Masahiro at the Thursday afternoon poster session:
4:30 p.m. PDT – 7 p.m. PDT
East Exhibition Hall A-B #E-2903
Community Takes on the Future of Graph Learning
Check out the outcomes of a very interesting discussion in our GLOW reading group.
glowreadinggroup.substack.com/p/community-...
Attended my first @netsciconf.bsky.social last week and it was amazing!
A highlight was the HONAI satellite that brought together the network science and machine learning communities.
...and who needs NeurIPS mugs if one can have NetSci toilet paper?
After a break in April, we welcome all of you to the next session of GLOW next week!
Join and interact with our speakers Christian Koke (scale in GNNs) and Yonatan Sverdlov (sparse geometric MPNNs).
May 28th, 5pm CEST on Zoom.
Details & sign-up: sites.google.com/view/graph-learning-on-weds.
The date is out: 15/09/2025
Come meet the graph community in Porto @MLG to discuss the latest developments and ideas in the field!
We welcome many kinds of papers beyond regular ones, such as work-in-progress papers or visionary (white) papers!
Submit by June 14th
mlg-europe.github.io/2025
Today, I gave a talk on Expressive Graph Representations via Homomorphisms in Subhankar Mishra's lab at NISER, India.
For all those (all - 50) people who missed it at the LoG Paris meetup, there is now a recording available:
www.niser.ac.in/~smishra/eve...
Thanks for the invite!
GLOW is returning on March 26th, 5pm CET with a special guest: @petar-v.bsky.social
He will lecture on LLMs as GNNs, a topic which received quite some attention at our last session.
Specifically, we will learn how Graph ML tools can help understand LLM generalisation.
I am hiring for a Ph.D. student to work in the areas of social network analysis, algorithms and fair machine learning.
Please apply and join our highly motivated team.
For more information please see the call: neumannstefan.com/hiring/
In two hours (i.e., 5pm CET), we'll have the first GLOW meeting of the year. Join us for two interesting talks and detailed discussions!
We organise a thematic day on Graph Machine Learning and Graph Neural Networks at IHP in Paris on March 31st. Please consider submitting abstracts!
It'll be fun!
Free mandatory registration:
gdr-iasis.cnrs.fr/reunions/app...
GLOW 2025 kicks off with a great session in January!
Join and interact with our speakers Clayton Sanford (on transformers for graph algorithms) and Derek Lim (on graph metanetworks).
Jan 15th, 5pm CET on Zoom.
Details & sign-up: sites.google.com/view/graph-learning-on-weds.
Ever needed a graph neural network with maximal expressivity on almost all molecules (aka outerplanar graphs)?
Turns out you only need a simple graph transformation called igt!
Short talk: youtube.com/watch?v=AW6C...
TMLR paper: openreview.net/forum?id=XxbQA
Happening right now.
Join via zoom: rwth.zoom.us/j/6321681001...
TL;DR: When more expressive GNNs outperform less expressive GNNs, the gain is not due to expressivity. PDF versions of the poster and the paper are available at https://pwelke.de
Is Expressivity Essential for the Predictive Performance of Graph Neural Networks?
Spoiler alert: No.
Check out our poster at the Sci4DL workshop, today at 4.30pm, West Meeting Room 205-207
Paper: pwelke.de/publications...
Poster: pwelke.de/publications...
Oral at 10.20 am in West Exhibit Hall C
Poster #3009 at 11am in East Exhibit Hall A-C
Visual depiction of r-lGIN: During preprocessing, we calculate the path neighborhoods N_r(v) for each node v in the graph G. Paths of varying lengths are processed separately using simple GINs, and their embeddings are pooled to obtain the final graph embedding. The forward complexity scales linearly with the sizes of N_r(v), enabling efficient computation on sparse graphs.
Today at NeurIPS:
Weisfeiler and Leman go Loopy: A New Hierarchy for Graph Representational Learning
Cycles are important for predictive tasks on chemical molecules. We allow message passing along neighboring paths. Our architecture can subgraph-count cycles and homomorphism-count cactus graphs.
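The figure description above (precompute path neighborhoods, process each path length separately, then pool) can be sketched in a few lines. This is a toy illustration under loud assumptions: the function names are mine, and a fixed sum stands in for the learned per-length GIN layers; it only shows how the path-neighborhood bookkeeping works, not the actual r-lGIN architecture.

```python
# Toy sketch of path-neighborhood aggregation (NOT the real r-lGIN).
# A fixed sum replaces the learned per-length GIN layers from the figure.

def path_neighborhoods(adj, v, r):
    """All simple paths of length 1..r (edge count) starting at node v."""
    paths = []
    def dfs(path):
        if len(path) > 1:
            paths.append(tuple(path))
        if len(path) - 1 == r:  # reached maximum path length
            return
        for u in adj[path[-1]]:
            if u not in path:   # simple paths only
                dfs(path + [u])
    dfs([v])
    return paths

def embed_graph(adj, features, r):
    """Pool per-length path aggregates into a graph-level embedding.

    Paths of each length are processed separately (mirroring the separate
    GINs in the figure) and pooled by summation, one slot per length.
    """
    per_length = [0.0] * r
    for v in adj:
        for p in path_neighborhoods(adj, v, r):
            per_length[len(p) - 2] += sum(features[u] for u in p)
    return per_length

# On a triangle, every node starts two length-1 and two length-2 paths.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
feats = {0: 1.0, 1: 1.0, 2: 1.0}
print(embed_graph(triangle, feats, 2))  # -> [12.0, 18.0]
```

Since the work per node is proportional to |N_r(v)|, the forward pass stays cheap on sparse graphs, matching the complexity claim in the figure caption.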
The @logconference.bsky.social meetup Italy starts today in Siena. I'll certainly be there to enjoy presentations and posters on graph learning. Let's chat!
And of course, I'll enjoy beautiful Siena and amazing food.
sites.google.com/student.unis...
Give yourself an early Christmas present: Visit GLOW and learn about two amazing papers and interact with their authors!
The slides are on my homepage. Official recordings of the event will follow soon.
pwelke.de/presentation...
Had an incredible time at the Learning on Graphs meetup in Paris! Amazing energy, awesome presentations, and lovely posters. I was honored to give a keynote on graph representation learning via homomorphisms and loved the insightful questions and vibrant discussions.