Talk: Nonlinear Sheaf Diffusion in Graph Neural Networks
Monday, 19 February, 2024 - 16:00
DIAG, Room B203
The talk provides an overview of a master's thesis and ongoing project titled "Nonlinear Sheaf Diffusion in Graph Neural Networks". The study explores the potential benefits of introducing nonlinearity into the Laplacian of Sheaf Neural Networks for node classification tasks on graphs. Starting from the motivation, which arises from the field of Opinion Dynamics, we will move to a theoretical analysis of this nonlinearity and then discuss the practical utility of the proposed technique. The project has been driven by thorough experimental validation, both to confirm the practical effectiveness of the methodology and to guide the design of different versions of the model. The starting point for this project is "Neural Sheaf Diffusion", previous work by Cristian Bodnar et al., in which a Sheaf Neural Network model is designed to address common issues in Graph Neural Networks, such as oversmoothing and heterophily. Their contributions served as inspiration for this thesis, opening new research directions in Topological Deep Learning, a field that enhances our understanding of complex data structures from a topological perspective.
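To make the idea concrete, the following is a minimal sketch (not the talk's actual model) of discrete sheaf diffusion, where each node carries a d-dimensional stalk vector and each edge compares the two endpoints through restriction maps; the optional nonlinearity `sigma` applied to the edge discrepancy is the kind of modification the abstract alludes to (e.g. a bounded-confidence-style `tanh`, as in opinion dynamics). The function name, the `(edge, node) -> restriction map` dictionary layout, and the choice of `tanh` are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np

def sheaf_diffusion_step(x, edges, maps, alpha=0.1, sigma=None):
    """One step of (optionally nonlinear) sheaf diffusion.

    x     : (n, d) array, one d-dimensional stalk vector per node
    edges : list of (u, v) node-index pairs
    maps  : dict keyed by (edge_index, node_index) -> (d, d) restriction map
    sigma : optional elementwise nonlinearity applied to each edge
            discrepancy; sigma=None gives the usual linear sheaf Laplacian
    """
    if sigma is None:
        sigma = lambda z: z  # linear case: standard sheaf diffusion
    new_x = x.copy()
    for e, (u, v) in enumerate(edges):
        Fu, Fv = maps[(e, u)], maps[(e, v)]
        # disagreement between the two endpoints, seen in the edge stalk
        delta = sigma(Fu @ x[u] - Fv @ x[v])
        # pull the (possibly nonlinear) discrepancy back to each node
        new_x[u] -= alpha * Fu.T @ delta
        new_x[v] += alpha * Fv.T @ delta
    return new_x

# Toy usage: a 3-node path with identity restriction maps, so linear
# diffusion reduces to ordinary graph heat diffusion toward consensus.
x = np.array([[1.0], [0.0], [-1.0]])
edges = [(0, 1), (1, 2)]
maps = {(0, 0): np.eye(1), (0, 1): np.eye(1),
        (1, 1): np.eye(1), (1, 2): np.eye(1)}
for _ in range(500):
    x = sheaf_diffusion_step(x, edges, maps, alpha=0.1)
```

With identity maps and `sigma=None` the node values drift toward a common consensus value; a nonlinear `sigma` such as `np.tanh` changes the speed and fixed points of this process, which is the phenomenon the thesis analyzes.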
Olga Zaghen received her MSc in Artificial Intelligence Systems at the University of Trento, where she developed her interest in Geometric Deep Learning. Before that, she earned her BSc in Mathematics at the University of Milan. She wrote her MSc thesis on Sheaf Neural Networks at the University of Cambridge under the supervision of Prof. Pietro Liò and Prof. Andrea Passerini. She recently completed a research internship in the Vision and Learning Laboratory at KAIST, supervised by Prof. Seunghoon Hong, where she focused on random walks for graph representation learning.