Praise the sun! I am currently a Senior Research Associate in the Department of Computer Science at the University of Oxford, working with Michael Bronstein. Previously, I was a Research Associate at the University of Cambridge, and I spent a year at Twitter Cortex as an ML Researcher. I completed my PhD in Mathematics at UCL with a thesis on the analysis of singularity formation in rotationally symmetric Ricci flows. I am now interested in Geometric Deep Learning, working on both theoretical foundations and applications in the scientific domain.
My current lines of research include:
- (i) Understanding how information flows in message-passing models and the associated phenomenon of over-squashing, using quantities such as curvature and commute time; our first step in this direction received an ICLR Honorable Mention. More recently, we have introduced a novel paradigm for studying the expressive power of message-passing models (on graphs and/or point clouds) that depends precisely on their ability to induce interactions (mixing) across different nodes (points).
- (ii) Investigating how Graph Neural Networks “use” the underlying graph topology, and to what extent we need to rely on the same input graph to exchange messages across layers. This direction falls within the field of graph rewiring: I am working on applying frameworks inspired by these ideas to applied problems, with the aim of replacing expensive Transformer-style architectures. A recent work explores this direction by introducing novel notions of delay and distance-aware skip connections to mitigate vanishing-gradient issues and handle long-range interactions.
- (iii) Investigating how the geometry of the data can be better leveraged when designing generative modelling approaches (Flow Matching and Diffusion Models).
- (iv) Analyzing how the training dynamics are affected when enforcing symmetries and/or exact constraints.
Contact: francesco.di.giovanni at cs (dot) ox (dot) ac (dot) uk
- August 2023: Keynote speaker at the Maths for GDL workshop, ICIAM
- August 2023: Our work on understanding graph-convolutions through energies got accepted at TMLR
- June 2023: New paper out on characterising expressive power of message passing models through their mixing abilities
- June 2023: Invited speaker at the CECAM/Psi-k conference on “Bridging length scales with machine learning: from wavefunctions to thermodynamics”
- April 2023: Our new framework for message passing with delay got accepted at ICML 2023
- April 2023: Our new theoretical work on over-squashing got accepted at ICML 2023
- March 2023: Reviewer for ICML 2023
- December 2022: Invited panelist at the LoG tutorial on Graph Rewiring and Fairness
- December 2022: Keynote speaker at the NeurIPS 2022 Workshop “New Frontiers in Graph Learning”
- August 2022: Invited talk at the LoGaG reading group
- August 2022: Invited talk at the MML seminar at UCLA
- August 2022: Invited talk at the Stanford GNN Reading Group
- August 2022: Long talk at the Hammers and Nails 2022 Workshop in Tel Aviv
- July 2022: I taught at the First Italian School in Geometric Deep Learning
- July 2022: Mentor at LOGML22; our project is about graph rewiring using geometric exploration policies
- June 2022: Reviewer for NeurIPS 2022
- May 2022: I coauthored a blogpost with Michael Bronstein and Cristian Bodnar on a recent paper using cellular sheaf theory to tackle heterophily in GNNs
- April 2022: Our work on understanding over-squashing in GNNs through graph curvature received an Outstanding Paper Honorable Mention at ICLR 2022!
- April 2022: Aleksa Gordić made a great video about our paper on bottlenecks and over-squashing in GNNs. Highly recommended to anyone interested in understanding our work in some detail!
- March 2022: I gave a talk at the Dagstuhl seminar “Graph Embeddings: Theory meets Practice”
- January 2022: I was invited to share my opinion on future perspectives of GNNs in a blogpost authored by Michael Bronstein and Petar Veličković, featuring many prominent researchers in the field.