Bounds on eigenvalues of graphs and some applications of neural networks
Abstract
Many real-world datasets come with an inherent relational structure between data points that is encoded in the form of graphs or networks. Whereas graph theory studies the topology and combinatorics of graphs, graph signal processing studies signals defined on graphs and their variation. The two fields are intimately connected through the Laplacian, an operator acting on graph signals whose eigenvalues carry information about the graph structure. The pseudo-inverse of the Laplacian is closely related to effective resistance on graphs. We introduce a switch identity that simplifies effective-resistance computations, analogous to the series and parallel rules for electrical circuits. We then consider the strength of a graph, defined as the solution to an optimization problem, and derive bounds on the eigenvalues of the Laplacian in terms of the strength. We also train graph neural network models to compute the strength. For this task we use various node features and, through saliency analysis, show that resistance curvature has a strong connection with graph strength. To further illustrate applications of graph neural networks in mathematics, we train them to learn the influence probabilities for the site percolation model.
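As a minimal sketch of the standard relation the abstract alludes to (not the thesis's switch identity), the effective resistance between two nodes can be read off the Moore-Penrose pseudo-inverse of the graph Laplacian via R(u, v) = L⁺[u, u] + L⁺[v, v] − 2·L⁺[u, v]. The example below assumes a small NetworkX graph chosen purely for illustration and cross-checks the formula against NetworkX's built-in resistance distance.

```python
import numpy as np
import networkx as nx

# Example graph (an assumption for illustration): the 5-cycle.
G = nx.cycle_graph(5)

# Combinatorial Laplacian as a dense array and its Moore-Penrose pseudo-inverse.
L = nx.laplacian_matrix(G).toarray()
L_pinv = np.linalg.pinv(L)

def effective_resistance(u, v):
    """Effective resistance between nodes u and v via the Laplacian pseudo-inverse."""
    return L_pinv[u, u] + L_pinv[v, v] - 2 * L_pinv[u, v]

# Cross-check against NetworkX's resistance_distance on a couple of node pairs.
for u, v in [(0, 1), (0, 2)]:
    print(effective_resistance(u, v), nx.resistance_distance(G, u, v))
```

On the 5-cycle both computations give 0.8 for adjacent nodes and 1.2 for nodes two steps apart, matching the series/parallel rules for circuits that the switch identity is compared to.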