Laplacian Paradigm 2.0

Date/Time: Saturday, October 6th, 2018. 8:40 AM – 3:00 PM.
Location: Amphi Darboux, Institut Henri Poincaré, Paris, France
Organizers: Richard Peng (GaTech) and Sushant Sachdeva (UToronto)
Website credits: Junxing Wang (CMU)

Synopsis: Spectral algorithms originated with the use of eigenvalues and eigenvectors to analyze graphs, and have wide applications in machine learning, image processing, and network science. Over the past decade, the synergy of spectral algorithms with tools from scientific computing and combinatorial graph theory has led to the Laplacian paradigm for designing graph algorithms. For a wide range of fundamental graph problems such as max-flow, stationary distributions of random walks, or even shortest paths with general weights, the current best algorithms on sparse graphs utilize ideas related to graph Laplacians.
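To make the central object concrete: the Laplacian of a graph is its degree matrix minus its adjacency matrix, and many of the problems above reduce to solving linear systems in it. A minimal numpy sketch (the path graph and demand vector are illustrative assumptions, not from the program):

```python
import numpy as np

# Adjacency matrix of a 4-vertex path graph 0-1-2-3 (assumed example).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Graph Laplacian: degree matrix minus adjacency matrix.
L = np.diag(A.sum(axis=1)) - A

# Solve L x = b for a demand vector b with zero total: inject one unit
# of current at vertex 0 and extract it at vertex 3.
b = np.array([1.0, 0.0, 0.0, -1.0])

# L is singular (the all-ones vector is in its kernel), so use the
# pseudoinverse; x gives electrical potentials up to an additive shift.
x = np.linalg.pinv(L) @ b

# Potentials drop by one unit per edge: [0, -1, -2, -3].
print(np.round(x - x[0], 6))
```

The potential difference x[0] - x[3] = 3 is the effective resistance between the endpoints, one of the quantities the synopsis mentions.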

A common theme among algorithms usually described as ‘spectral algorithms’ – including foundational algorithms in the Laplacian paradigm – has been a clear transition between combinatorial and numerical components. This delineation is exemplified by spectral graph partitioning, which first computes eigenvectors of the graph Laplacian numerically, then clusters them using geometry and combinatorics.

On the other hand, a hallmark of recent progress in linear systems, optimization, and numerical problems broadly related to graph Laplacians is a tighter, more fine-grained integration of combinatorial and numerical components. These algorithms are built on a deeper understanding of concepts introduced by earlier works in the Laplacian paradigm, such as effective resistances of edges, Schur complements, and graph (re)sparsification.

This workshop will survey the recent progress, with a focus on the rich connections exposed by the new ways of combining algorithmic constructs.


Laplacian Paradigm 2.0: Merging Continuous and Discrete

Richard Peng (GaTech) 08:40 -- 09:10

Abstract: Over the past three decades, work on efficient solvers for a class of graph-structured matrices, graph Laplacians, has led to fundamental results in combinatorial optimization and scientific computing, as well as to the Laplacian paradigm for graph algorithms. In this talk I’ll survey this progress, with a focus on the underlying algorithm design approaches. Specifically, I will discuss the origin of the Laplacian paradigm in combining combinatorial and numerical building blocks via spectral graph theory, and describe recent efforts to design new tools tailored to the overall algorithmic questions.

Linear Algebraic Primitives Beyond Laplacian Solvers

Aaron Sidford (Stanford) 09:10 -- 09:50

Abstract: Over the past decade, nearly-linear-time Laplacian system solving has emerged as a powerful algorithmic hammer for improving the asymptotic running times of a wide variety of problems. Both the black-box application of Laplacian system solvers and the application of the algorithmic techniques underlying recent efficient Laplacian solvers have had broad implications in algorithm design. In this talk I will discuss recent progress in advancing this machinery beyond solving Laplacian systems towards solving broader classes of structured linear systems in nearly linear time. The talk will focus on recent advances in solving directed Laplacians (and, more broadly, M-matrices) in nearly linear time through efficient reductions to solving row-column diagonally dominant (RCDD) systems. Further, it will connect these problems to a variety of closely related linear algebraic problems, including PageRank computation, policy evaluation, stationary distribution computation, Perron vector computation, and more.
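As a concrete instance of the PageRank connection mentioned above: the PageRank vector is the solution of a linear system whose matrix is an M-matrix built from the random-walk transition matrix. A minimal sketch, with an assumed toy link graph and the standard damping factor:

```python
import numpy as np

# Column-stochastic transition matrix of a tiny 3-page link graph
# (assumed example): page 0 links to 1 and 2, page 1 to 2, page 2 to 0.
P = np.array([[0.0, 0.0, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 1.0, 0.0]])

alpha = 0.85  # damping factor
n = P.shape[0]

# PageRank solves (I - alpha * P) x = ((1 - alpha)/n) * 1, a system in
# an M-matrix closely related to a directed Laplacian.
x = np.linalg.solve(np.eye(n) - alpha * P, (1 - alpha) / n * np.ones(n))
x = x / x.sum()  # normalize to a probability vector (already sums to 1)

print(np.round(x, 4))
```

The same vector is the fixed point of the usual power iteration; solving the linear system directly is what fast directed-Laplacian solvers accelerate on large sparse graphs.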

Approximate Gaussian elimination and applications

Sushant Sachdeva (UToronto) 09:50 -- 10:30

Abstract: Gaussian elimination is the best-known algorithm for solving systems of linear equations, and one of the oldest algorithms known. However, it is ill-suited for solving sparse systems: even for sparse Laplacian matrices, it can require running time super-quadratic in the input size. Several previous approaches speed up Gaussian elimination by introducing error, but fail to provide guarantees. Over the last few years, new versions of approximate Gaussian elimination have been developed for graph Laplacians that provide provable approximation guarantees and run in time nearly linear in the sparsity of the graph. In this talk, we will discuss these approaches, and how they lead to 1) simpler nearly-linear-time Laplacian solvers, 2) nearly-linear-time solvers for systems strictly more general than Laplacians, and 3) faster estimation of all-pairs effective resistances (which implies faster algorithms for estimating the determinant of a Laplacian).
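To illustrate the elimination step this abstract builds on: eliminating a vertex from a graph Laplacian by one step of exact Gaussian elimination yields a Schur complement that is again a graph Laplacian, but typically denser; this fill-in is what approximate elimination is designed to control. A small sketch on an assumed star graph:

```python
import numpy as np

# Laplacian of a 4-vertex star: center 0 connected to 1, 2, 3
# (assumed example).
A = np.zeros((4, 4))
A[0, 1:] = A[1:, 0] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Eliminate vertex 0 by one step of Gaussian elimination: the remaining
# 3x3 block becomes the Schur complement
#   S = L[1:,1:] - L[1:,0] L[0,1:] / L[0,0].
S = L[1:, 1:] - np.outer(L[1:, 0], L[0, 1:]) / L[0, 0]

# S is the Laplacian of a triangle with edge weights 1/3: eliminating
# the star's center created a clique (fill-in) on its neighbors.
print(np.round(S, 4))
```

Each row of S sums to zero and its off-diagonal entries are nonpositive, the defining properties of a graph Laplacian, so elimination can be iterated, and approximate elimination sparsifies the cliques it would otherwise create.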

Analysing approximate Gaussian elimination using matrix martingales

Rasmus Kyng (Harvard) 11:00 -- 12:00

Abstract: The partial sums of a sequence of random variables form a martingale if each variable has zero mean conditioned on all the preceding variables. Concentration results for matrix-valued martingales have become an important tool in the analysis of algorithms in randomized numerical linear algebra. Simple and fast algorithms for many well-studied problems can be analyzed using martingales. We will explore recent results on using matrix martingales to analyze approximate Gaussian elimination on Laplacian matrices associated with undirected and directed graphs, and briefly survey other uses of matrix martingales in spectral graph theory.
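One setting where such matrix concentration bounds apply: keep each edge of a graph independently with probability p, reweighted by 1/p, so the sampled Laplacian equals the original in expectation; the deviation is a sum of independent zero-mean matrices, and concentration inequalities control its spectral norm. A small numerical sketch (the complete graph, sampling rate, and seed are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

n, p = 8, 0.5  # complete graph K_n; keep each edge with probability p

def laplacian(edges, weights, n):
    """Weighted graph Laplacian as a sum of rank-one edge terms."""
    L = np.zeros((n, n))
    for (u, v), w in zip(edges, weights):
        e = np.zeros(n)
        e[u], e[v] = 1.0, -1.0
        L += w * np.outer(e, e)
    return L

edges = [(u, v) for u in range(n) for v in range(u + 1, n)]
L = laplacian(edges, [1.0] * len(edges), n)

# Keep each edge independently with probability p, reweighted by 1/p so
# that E[L_sample] = L; the fluctuation around L is exactly the kind of
# sum of independent zero-mean matrices that matrix concentration
# (and, for adaptive sampling, matrix martingales) controls.
kept = [e for e in edges if rng.random() < p]
L_sample = laplacian(kept, [1.0 / p] * len(kept), n)

# Spectral-norm deviation relative to ||L||.
err = np.linalg.norm(L_sample - L, 2) / np.linalg.norm(L, 2)
print(round(err, 3))
```

In approximate Gaussian elimination the sampled terms are chosen adaptively as elimination proceeds, which is why the analysis needs martingale (rather than independent-sum) versions of these bounds.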

Graph Structure via Exact Gaussian Elimination

Aaron Schild (UC Berkeley) 14:00 -- 15:00

Abstract: Graph partitioning has played an important role in theoretical computer science, particularly in the design of approximation algorithms and metric embeddings. In this talk, we revisit some of these applications from the perspective of Gaussian elimination, particularly in the context of random spanning tree generation and oblivious routing. In the process, we will see how fundamental tradeoffs in graph partitioning can be overcome by eliminating vertices from an undirected graph using Gaussian elimination on the associated Laplacian matrix.
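One classical bridge between Gaussian elimination and spanning trees, in the spirit of this talk, is Kirchhoff's matrix-tree theorem: the number of spanning trees of a graph equals the determinant of its Laplacian with any one vertex removed, equivalently the product of the pivots produced by elimination. A minimal sketch on an assumed 4-cycle:

```python
import numpy as np

# Laplacian of the 4-cycle 0-1-2-3-0 (assumed example).
A = np.zeros((4, 4))
for u, v in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    A[u, v] = A[v, u] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Matrix-tree theorem: the number of spanning trees equals the
# determinant of the Laplacian with one vertex (row and column)
# deleted, i.e. the product of the pivots in Gaussian elimination.
num_trees = np.linalg.det(L[1:, 1:])

print(round(num_trees))  # prints 4: delete any one of the cycle's 4 edges
```

This determinant-through-elimination view is one reason vertex elimination on Laplacians interacts so naturally with random spanning tree generation.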