Towards Neural Sparse Linear Solvers

14 Mar 2022  ·  Luca Grementieri, Paolo Galeone

Large sparse symmetric linear systems appear in several branches of science and engineering thanks to the widespread use of the finite element method (FEM). The fastest sparse linear solvers available implement hybrid iterative methods. These methods rely on heuristic algorithms to permute rows and columns or to find a preconditioner matrix. In addition, they are inherently sequential, which prevents them from fully leveraging GPU processing power. We propose neural sparse linear solvers, a deep learning framework to learn approximate solvers for sparse symmetric linear systems. Our method relies on representing a sparse symmetric linear system as an undirected weighted graph. Such a graph representation is inherently permutation-equivariant and scale-invariant, and it can serve as the input to a graph neural network trained to regress the solution. We test neural sparse linear solvers on static linear analysis problems from structural engineering. Our method is less accurate than classic algorithms, but it is hardware-independent, fast on GPUs, and applicable to generic sparse symmetric systems without additional assumptions. Although many limitations remain, this study shows a general approach to tackling problems involving sparse symmetric matrices using graph neural networks.
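To make the graph representation described in the abstract concrete, the sketch below shows one plausible way to encode a sparse symmetric system Ax = b as an undirected weighted graph: unknowns become nodes, off-diagonal nonzeros become weighted edges, and the diagonal of A together with b becomes node features. This is an illustrative assumption, not the authors' implementation; the function name system_to_graph and the exact choice of features are hypothetical.

```python
import numpy as np
import scipy.sparse as sp

def system_to_graph(A: sp.csr_matrix, b: np.ndarray):
    """Encode a sparse symmetric system Ax = b as a weighted graph (sketch).

    Nodes correspond to unknowns x_i; each nonzero off-diagonal entry A_ij
    yields a weighted edge (i, j). Node features collect the diagonal A_ii
    and the right-hand side b_i, a format a graph neural network can consume.
    Feature choice is an assumption for illustration, not the paper's exact design.
    """
    coo = sp.coo_matrix(A)
    # Keep only off-diagonal nonzeros as edges; symmetry of A provides
    # both directions of each undirected edge.
    mask = coo.row != coo.col
    edge_index = np.stack([coo.row[mask], coo.col[mask]])  # shape (2, num_edges)
    edge_weight = coo.data[mask]                           # shape (num_edges,)
    # Node features: diagonal of A and the right-hand side b.
    node_features = np.stack([A.diagonal(), b], axis=1)    # shape (n, 2)
    return node_features, edge_index, edge_weight

# Example: a small symmetric positive-definite system.
A = sp.csr_matrix(np.array([[ 4.0, -1.0,  0.0],
                            [-1.0,  4.0, -1.0],
                            [ 0.0, -1.0,  4.0]]))
b = np.array([1.0, 2.0, 3.0])
features, edges, weights = system_to_graph(A, b)
```

Because the edge set and node features are built directly from the nonzero pattern of A, relabeling the unknowns only permutes nodes and edges, which is consistent with the permutation-equivariance the abstract attributes to this representation.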
