Seminars and Colloquia by Series

Reimagining Spectral Graph Theory

Series
ACO Alumni Lecture
Time
Wednesday, September 27, 2023 - 16:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Stephen Young, Pacific Northwest National Laboratory

Abstract: While spectral methods provide far-ranging insights into graph structure, significant challenges remain in their application to real data. Most notably, spectral methods do not incorporate information that may be available beyond adjacency. A common approach is to encode such additional information, in an ad-hoc manner, into weights associated with the edges. Not only does this have limited expressivity, it is also restricted by graph structure: if two vertices are not adjacent, then edge weights cannot capture any closeness implied by the metadata.

We address this issue by introducing the inner product Hodge Laplacian for an arbitrary simplicial complex. Within this framework we prove generalizations of foundational results in spectral graph theory, such as the Cheeger inequality and the expander mixing lemma, and show that our framework recovers the usual combinatorial and normalized Laplacians as special cases. Our framework allows for the principled synthesis of combinatorial approaches in network science with more metadata-driven approaches, by using latent space encodings of the metadata to define an inner product on both the vertices and the edges.
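
As a rough, purely illustrative sketch of the flavor of this construction (restricted to plain graphs, with hypothetical matrix names, and assuming diagonal inner products, which is not necessarily the speaker's setting): with signed incidence matrix B and inner-product matrices W0 on vertices and W1 on edges, the weighted vertex Laplacian W0^{-1} B W1 B^T recovers the combinatorial Laplacian when W0 and W1 are identities, and the random-walk normalized Laplacian when W0 is the degree matrix.

import numpy as np

# Toy graph: a path on three vertices with edges (0,1) and (1,2).
# B is the signed vertex-edge incidence matrix (rows = vertices, columns = edges).
B = np.array([[ 1.0,  0.0],
              [-1.0,  1.0],
              [ 0.0, -1.0]])

def weighted_laplacian(B, W0, W1):
    # Vertex Laplacian built from inner products W0 (vertices) and W1 (edges).
    return np.linalg.inv(W0) @ B @ W1 @ B.T

n, m = B.shape

# Identity inner products recover the combinatorial Laplacian D - A.
L_comb = weighted_laplacian(B, np.eye(n), np.eye(m))

# Degree inner product on the vertices recovers the random-walk normalized Laplacian.
D = np.diag(np.diag(L_comb))
L_rw = weighted_laplacian(B, D, np.eye(m))

print(L_comb)  # [[ 1. -1.  0.] [-1.  2. -1.] [ 0. -1.  1.]]
print(L_rw)    # diagonal entries equal 1, rows sum to 0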

(Coffee will be available at 3:30 before this talk, following the speaker's Professional Development Seminar at 2:30pm.)


Algorithms for minimum norm combinatorial optimization

Series
ACO Alumni Lecture
Time
Friday, October 2, 2020 - 13:05 for 1 hour (actually 50 minutes)
Location
Bluejeans link: https://bluejeans.com/264244877/0166
Speaker
Deeparnab Chakrabarty, Dartmouth College, NH

In many optimization problems, a feasible solution induces a multi-dimensional cost vector. For example, in load balancing a schedule induces a load vector across the machines; in k-clustering, opening k facilities induces an assignment-cost vector across the clients. In this paper we consider the following minimum norm optimization problem: given an arbitrary monotone, symmetric norm, find a solution which minimizes the norm of the induced cost vector. This generalizes many fundamental NP-hard problems. We give a general framework to tackle the minimum norm problem, and illustrate its efficacy in load balancing and, time permitting, in the clustering setting.
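
As a purely illustrative aside (not the framework from the talk), monotone symmetric norms include the L_p norms and the top-ell norms, i.e. the sum of the ell largest entries: minimizing the top-1 norm of a load vector is makespan minimization, while the top-n norm gives total load. The snippet below, with hypothetical names, simply evaluates these norms on a fixed load vector.

import numpy as np

def top_ell_norm(x, ell):
    # Sum of the ell largest absolute entries: a monotone, symmetric norm.
    return np.sort(np.abs(x))[::-1][:ell].sum()

loads = np.array([4.0, 1.0, 3.0, 2.0])  # hypothetical machine loads induced by a schedule

print(top_ell_norm(loads, 1))           # 4.0  -> L_infinity norm (makespan)
print(top_ell_norm(loads, len(loads)))  # 10.0 -> L_1 norm (total load)
print(top_ell_norm(loads, 2))           # 7.0  -> an intermediate objective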

(The speaker is an ACO alum; after the lecture, the speaker will engage with the ACO students for 30-45 minutes.)

Clustered coloring for old coloring conjectures

Series
ACO Alumni Lecture
Time
Thursday, March 14, 2019 - 16:30 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Chun-Hung Liu, Texas A&M

Hadwiger (Hajos, and Gerards and Seymour, respectively) conjectured that the vertices of every graph with no K_{t+1} minor (topological minor, and odd minor, respectively) can be colored with t colors such that adjacent vertices receive different colors. These conjectures are stronger than the Four Color Theorem and are either wide open or false in general. A weakening of these conjectures is clustered coloring, which only requires every monochromatic component to have bounded size instead of size 1. It is known that t colors are still necessary for the clustered coloring versions of these three conjectures. Joint with David Wood, we prove a series of tight results about clustered coloring on graphs with no subgraph isomorphic to a fixed complete bipartite graph. These results have a number of applications. In particular, they imply that the clustered coloring version of Hajos' conjecture is true for bounded-treewidth graphs in a stronger sense: K_{t+1}-topological-minor-free graphs of bounded treewidth are clustered t-list-colorable. They also lead to the first linear upper bound for the clustered coloring version of Hajos' conjecture and the currently best upper bound for the clustered coloring version of the Gerards-Seymour conjecture.
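
To make the notion concrete: a coloring is clustered with clustering c if every monochromatic connected component has at most c vertices (proper coloring is the case c = 1). The sketch below, with hypothetical function and variable names, merely checks this definition on a small adjacency-list graph; it is not taken from the paper.

from collections import deque

def is_clustered_coloring(adj, color, c):
    # Check that every monochromatic component of the graph has at most c vertices.
    # adj:   adjacency list, adj[v] iterates over the neighbors of v
    # color: color[v] is the color assigned to vertex v
    # c:     bound on the size of monochromatic components
    seen = set()
    for start in adj:
        if start in seen:
            continue
        # BFS restricted to edges whose endpoints share the same color.
        size, queue = 0, deque([start])
        seen.add(start)
        while queue:
            v = queue.popleft()
            size += 1
            for u in adj[v]:
                if u not in seen and color[u] == color[v]:
                    seen.add(u)
                    queue.append(u)
        if size > c:
            return False
    return True

# A triangle with two colors: not a proper coloring (c = 1), but clustered with c = 2.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
color = {0: "red", 1: "red", 2: "blue"}
print(is_clustered_coloring(adj, color, 1))  # False
print(is_clustered_coloring(adj, color, 2))  # True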

The power of parallelization in large-scale convex optimization

Series
ACO Alumni Lecture
Time
Tuesday, November 27, 2018 - 12:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Cristobal Guzman, Universidad Católica de Chile

Recently there has been a surge of parallelization techniques to speed up optimization algorithms, particularly in applications in statistical learning and structured linear programs. Motivated by these developments, we seek theoretical explanations of the provable improvements (or the lack thereof) in performance obtained by parallelizing optimization algorithms. In 1994, Nemirovski proved that for low-dimensional optimization problems only a very limited improvement can be obtained by parallelization, and furthermore conjectured that no acceleration should be achievable by these means. In this talk, I will present new results showing that in high-dimensional settings no acceleration can be obtained by parallelization, providing strong evidence towards Nemirovski's conjecture. This is joint work with Jelena Diakonikolas (UC Berkeley).
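
For context, and as an assumption about the standard setting rather than a statement of the speaker's model: in the parallel oracle model an algorithm submits a batch of K query points per round, receives first-order information at all of them, and is charged by the number of rounds rather than individual queries. The toy snippet below, with hypothetical names, only illustrates one such round on a simple nonsmooth convex function; it does not reproduce the lower-bound constructions from the talk.

import numpy as np

def subgradient(x):
    # A subgradient of the nonsmooth convex function f(x) = max_i |x_i|.
    g = np.zeros_like(x)
    i = int(np.argmax(np.abs(x)))
    g[i] = np.sign(x[i]) if x[i] != 0 else 1.0
    return g

def parallel_oracle(points):
    # One round of the K-parallel oracle: value and subgradient at all K query points.
    return [(np.max(np.abs(x)), subgradient(x)) for x in points]

# One round with K = 4 parallel queries around a current iterate.
rng = np.random.default_rng(0)
x = np.ones(8)
batch = [x + 0.1 * rng.standard_normal(x.shape) for _ in range(4)]
answers = parallel_oracle(batch)

# The algorithm may use all K answers before choosing its next batch;
# the question studied in the talk is whether this buys fewer rounds.
print(len(answers), "queries answered in a single round")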