Seminars and Colloquia by Series

Prague dimension of random graphs

Series
ACO Student Seminar
Time
Friday, November 20, 2020 - 13:00 for 1 hour (actually 50 minutes)
Location
ONLINE
Speaker
Kalen Patton, Math, Georgia Tech

Various notions of dimension are important throughout mathematics, and for graphs the so-called Prague dimension was introduced by Nešetřil, Pultr and Rödl in the 1970s. Proving a conjecture of Füredi and Kantor, we show that the Prague dimension of the binomial random graph is typically of order $n/\log n$ for constant edge-probabilities. The main new proof ingredient is a Pippenger-Spencer type edge-coloring result for random hypergraphs with large uniformities, i.e., edges of size $O(\log n)$.
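In display form (writing $\dim_P$ for the Prague dimension; this notation is introduced here only for readability and is not from the abstract), the main result says that with high probability

\[
  \dim_P\bigl(G_{n,p}\bigr) \;=\; \Theta\!\left(\frac{n}{\log n}\right)
  \qquad \text{for every constant } p \in (0,1).
\]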

Based on joint work with He Guo and Lutz Warnke.

Approximation Algorithms for Mixed Integer Non-Linear Optimization Problems

Series
ACO Student Seminar
Time
Friday, November 13, 2020 - 13:00 for 1 hour (actually 50 minutes)
Location
ONLINE
Speaker
Guanyi Wang, ISyE, Georgia Tech

For computationally intensive mixed integer non-linear optimization problems, a major challenge is to verify or guarantee the quality of any feasible solution, under mild assumptions, in a tractable fashion. In this talk, we focus on tackling this challenge by constructing tight relaxations and designing approximation algorithms for two different mixed integer non-linear optimization problems.

In the first part, we focus on the (row) sparse principal component analysis (rsPCA) problem, that is, the problem of finding the top-r leading principal components of a covariance matrix such that all of these principal components are linear combinations of a subset of k variables. The rsPCA problem is a widely used dimensionality reduction tool with an additional sparsity constraint to enhance its interpretability. We propose (a) a convex integer programming relaxation of rsPCA that gives upper (dual) bounds for rsPCA, and (b) a new local search algorithm for finding primal feasible solutions for rsPCA. We also show that, in the worst case, the dual bounds provided by the convex IP are within an affine function of the global optimal value. We demonstrate our techniques applied to large-scale covariance matrices.
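To make the primal side concrete, here is a minimal local-search sketch in Python for the rank-one case (r = 1): greedily swap variables in and out of a size-k support to increase the top eigenvalue of the corresponding principal submatrix. This is a generic swap heuristic for illustration only, not the algorithm proposed in the talk, and the function name and parameters are hypothetical.

import numpy as np

def sparse_pca_local_search(cov, k, iters=100, seed=0):
    # Hypothetical swap-based local search for the r = 1 case of rsPCA:
    # find a k-subset S of variables (approximately) maximizing the top
    # eigenvalue of the principal submatrix cov[S, S].  Illustration only;
    # this is NOT the algorithm from the talk.
    rng = np.random.default_rng(seed)
    n = cov.shape[0]
    support = list(rng.choice(n, size=k, replace=False))

    def top_eig(S):
        return np.linalg.eigvalsh(cov[np.ix_(S, S)])[-1]  # largest eigenvalue

    best = top_eig(support)
    for _ in range(iters):
        improved = False
        for i in range(k):
            for j in set(range(n)) - set(support):
                candidate = support[:i] + [j] + support[i + 1:]
                val = top_eig(candidate)
                if val > best + 1e-12:
                    support, best, improved = candidate, val, True
        if not improved:
            break  # local optimum: no single swap improves the objective
    return sorted(support), best

# Example usage on a random covariance matrix.
X = np.random.default_rng(1).normal(size=(200, 20))
print(sparse_pca_local_search(np.cov(X.T), k=5))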

In the second part, we focus on improving the execution speed of compute-intensive numerical code. Such code, especially of the variety encountered in deep neural network inference and training, is often written as nested for-loops, and one of the main bottlenecks for the execution speed of these loops is memory latency. Iteration-space tiling is a common memory management technique used to deal with memory latency. We study the problem of automatically optimizing the implementation of these nested loops by formulating the iteration-space tiling problem as an integer geometric programming (IGP) problem. We show how to design an efficient approximation algorithm for this problem and how to use the so-called "non-uniform tiling" technique to improve the execution speed.
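As a concrete illustration of iteration-space tiling, consider blocking the classic triple loop of a matrix multiply so that each pair of blocks is reused while still in cache. This is plain uniform tiling, not the talk's IGP formulation or its non-uniform tiling, and the tile size 64 is an arbitrary placeholder.

import numpy as np

def matmul_tiled(A, B, tile=64):
    # Uniform iteration-space tiling of the i/j/k loops of a matrix multiply:
    # process tile x tile blocks so the blocks of A and B are reused while
    # they are still in cache.  Each block product is delegated to NumPy;
    # tile=64 is an arbitrary illustrative choice, not a tuned one.
    n, m = A.shape
    m2, p = B.shape
    assert m == m2, "inner dimensions must match"
    C = np.zeros((n, p))
    for i0 in range(0, n, tile):
        for j0 in range(0, p, tile):
            for k0 in range(0, m, tile):
                i1, j1, k1 = min(i0 + tile, n), min(j0 + tile, p), min(k0 + tile, m)
                C[i0:i1, j0:j1] += A[i0:i1, k0:k1] @ B[k0:k1, j0:j1]
    return C

# Sanity check against NumPy's own matmul.
A = np.random.rand(100, 80)
B = np.random.rand(80, 120)
assert np.allclose(matmul_tiled(A, B), A @ B)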

The first part of the talk is joint work with Santanu S. Dey, Rahul Mazumder, and Marco Molinaro; the second part is joint work with Ofer Dekel.

$k$-planar crossing numbers and the midrange crossing constant

Series
ACO Student Seminar
Time
Friday, October 30, 2020 - 13:00 for 1 hour (actually 50 minutes)
Location
Online
Speaker
Dr. Zhiyu Wang, Math, Georgia Tech

The crossing number of a graph is the minimum number of crossings over all drawings of the graph in the plane. Let $\kappa(n, m)$ be the minimum crossing number of graphs with $n$ vertices and (at least) $m$ edges. Erd\H{o}s and Guy conjectured, and Pach, Spencer and T\'oth proved, that for any $m = m(n)$ satisfying $n \ll m \ll n^2$, the quantity $\lim_{n \to \infty} \frac{\kappa(n,m) n^2}{m^3}$ exists and is positive. The $k$-planar crossing number of a graph is the minimum crossing number obtained when we partition the edges of the graph into $k$ subgraphs and draw them in $k$ planes. Using designs and a probabilistic algorithm, it can be shown that the guaranteed factor of improvement $\alpha_k$ between the $k$-planar and regular crossing number is $\frac{1}{k^2} (1 + o(1))$, while if we restrict our attention to bipartite graphs, this constant is $\beta_k = \frac{1}{k^2}$ exactly. The lower bound proofs require the existence of a midrange crossing constant. Motivated by this, we show that the midrange crossing constant exists for all graph classes (including bipartite graphs) that satisfy certain mild conditions. The regular midrange crossing constant was shown to be at most $\frac{8}{9\pi^2}$; we present a probabilistic construction that also shows this bound.
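In display form, the midrange crossing constant discussed above is

\[
  c \;=\; \lim_{n \to \infty} \frac{\kappa(n,m)\, n^2}{m^3}
  \qquad \text{for } n \ll m \ll n^2,
\]

which exists and is positive by the theorem of Pach, Spencer and T\'oth, and which satisfies $c \le \frac{8}{9\pi^2}$ by the bound mentioned at the end of the abstract.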

The Sunflower Problem

Series
ACO Student Seminar
Time
Friday, October 23, 2020 - 13:00 for 1 hour (actually 50 minutes)
Location
Online
Speaker
Tolson Bell, Math, Georgia Tech

A sunflower with p petals consists of p sets whose pairwise intersections are all the same set. The goal of the sunflower problem is to find the smallest r = r(p,k) such that every family of at least r^k k-element sets must contain a sunflower with p petals. Major breakthroughs within the last year by Alweiss-Lovett-Wu-Zhang and others show that r = O(p log(pk)) suffices. In this talk, after reviewing the history and significance of the sunflower problem, I will present our improvement to r = O(p log k), which we obtained during the 2020 REU at Georgia Tech. As time permits, I will elaborate on key lemmas and techniques used in recent improvements.
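To pin down the definition, here is a tiny brute-force check in Python (exponential in the size of the family, purely illustrative, with a hypothetical function name) that tests whether a family of sets contains p sets whose pairwise intersections all equal one common core.

from itertools import combinations

def has_sunflower(family, p):
    # Brute-force check (exponential in len(family); illustration only):
    # do p of the given sets have pairwise intersections that all equal
    # one common core?
    for petals in combinations(family, p):
        core = set.intersection(*petals)
        if all(a & b == core for a, b in combinations(petals, 2)):
            return True
    return False

# {1,2}, {1,3}, {1,4} form a sunflower with 3 petals and core {1}.
print(has_sunflower([{1, 2}, {1, 3}, {1, 4}], 3))  # True
print(has_sunflower([{1, 2}, {2, 3}, {1, 3}], 3))  # False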

Based on joint work with Suchakree Chueluecha (Lehigh University) and Lutz Warnke (Georgia Tech), see https://arxiv.org/abs/2009.09327

Hyperbolic Relaxations of Locally Positive Semidefinite Matrices

Series
ACO Student Seminar
Time
Friday, October 9, 2020 - 13:00 for 1 hour (actually 50 minutes)
Location
https://bluejeans.com/264244877/0166
Speaker
Kevin Shu, Math, Georgia Tech

Semidefinite programming is a powerful optimization tool, which involves optimizing linear functions over a slice of the positive semidefinite matrices. Locally PSD matrices are a natural relaxation of the PSD matrices that can be useful in reducing the space required for semidefinite optimization. We use the theory of hyperbolic polynomials to give precise quantitative bounds on the quality of the approximation that results from optimizing over the locally PSD cone instead of the PSD cone.
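As a sketch of the objects involved, assuming "locally PSD" means that every k x k principal submatrix is positive semidefinite (the talk's precise definition may differ), the following Python snippet checks that property and exhibits a matrix that is locally PSD for k = 2 but not globally PSD.

import numpy as np
from itertools import combinations

def is_locally_psd(M, k, tol=1e-9):
    # Check that every k x k principal submatrix of the symmetric matrix M
    # is positive semidefinite.  ("Locally PSD" is taken to mean exactly
    # this here; the talk may use a different precise definition.)
    n = M.shape[0]
    for S in combinations(range(n), k):
        if np.linalg.eigvalsh(M[np.ix_(S, S)])[0] < -tol:
            return False
    return True

# A matrix can be locally PSD (k = 2) without being PSD:
M = np.array([[1.0, 1.0, -1.0],
              [1.0, 1.0, 1.0],
              [-1.0, 1.0, 1.0]])
print(is_locally_psd(M, 2))                # True: all 2x2 principal blocks are PSD
print(np.all(np.linalg.eigvalsh(M) >= 0))  # False: M itself is not PSD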

ACO Alumni Lecture series: Algorithms for minimum norm combinatorial optimization

Series
ACO Student Seminar
Time
Friday, October 2, 2020 - 13:00 for 1 hour (actually 50 minutes)
Location
https://bluejeans.com/264244877/0166
Speaker
Dr. Deeparnab Chakrabarty, CS, Dartmouth College

In many optimization problems, a feasible solution induces a multi-dimensional cost vector. For example, in load balancing, a schedule induces a load vector across the machines; in k-clustering, opening k facilities induces an assignment-cost vector across the clients. In this work we consider the following minimum norm optimization problem: given an arbitrary monotone, symmetric norm, find a solution that minimizes the norm of the induced cost vector. This generalizes many fundamental NP-hard problems. We give a general framework to tackle the minimum norm problem, and illustrate its efficacy in load balancing and, time permitting, in the clustering setting.
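For a concrete sense of what a monotone, symmetric norm can be, the top-k norms are standard examples (not taken from the talk): k = 1 recovers the maximum load (makespan) of a load vector, and k = n recovers the total load.

def top_k_norm(costs, k):
    # Top-k norm: sum of the k largest coordinates (in absolute value) of
    # the cost vector.  These are monotone, symmetric norms: k = 1 gives the
    # max (makespan), k = len(costs) gives the sum (total load).
    return sum(sorted((abs(c) for c in costs), reverse=True)[:k])

loads = [4, 1, 7, 3]                  # e.g. machine loads induced by a schedule
print(top_k_norm(loads, 1))           # 7  -> minimizing this norm = minimizing makespan
print(top_k_norm(loads, len(loads)))  # 15 -> minimizing this norm = minimizing total load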

Minimal problems in 3D reconstruction

Series
ACO Student Seminar
Time
Friday, September 25, 2020 - 13:00 for 1 hour (actually 50 minutes)
Location
https://bluejeans.com/264244877/0166
Speaker
Timothy Duff, Math, Georgia Tech

I describe my ongoing work using tools from computational and combinatorial algebraic geometry to classify minimal problems and identify which can be solved efficiently. I will not assume any background in algebraic geometry or computer vision.

Structure-from-motion algorithms reconstruct a 3D scene from many images, often by matching features (such as points and lines) between the images. These matches lead to constraints, resulting in a nonlinear system of polynomial equations that recovers the 3D geometry. Since many matches are outliers, these methods are used within an iterative framework for robust estimation called RANSAC (RANdom SAmple Consensus), whose efficiency hinges on using a small number of correspondences in each iteration. As a result, there is a strong focus on constructing polynomial solvers for these "minimal problems" that run as fast as possible. Our work classifies these problems in cases of practical interest (calibrated cameras, complete and partial visibility). Moreover, we identify candidates for practical use, as quantified by "algebraic complexity measures" (degree, Galois group).
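The role a minimal solver plays inside RANSAC can be sketched in a few lines of Python. The loop below is only the generic robust-estimation skeleton; the callbacks fit_minimal and residual are hypothetical placeholders, and the per-iteration cost is dominated by fit_minimal, which is why fast solvers for minimal problems matter.

import random

def ransac(data, fit_minimal, residual, sample_size, iters=1000, thresh=1.0):
    # Generic RANSAC skeleton (illustrative; fit_minimal and residual are
    # hypothetical user-supplied callbacks).  Each iteration fits a model to
    # a minimal sample of correspondences and keeps the model with the most
    # inliers.
    best_model, best_inliers = None, []
    for _ in range(iters):
        sample = random.sample(data, sample_size)
        model = fit_minimal(sample)          # solve the minimal problem
        if model is None:                    # degenerate sample
            continue
        inliers = [d for d in data if residual(model, d) < thresh]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = model, inliers
    return best_model, best_inliers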

Joint work with Anton Leykin, Kathlén Kohn, and Tomáš Pajdla (arXiv:1903.10008, arXiv:2003.05015), and with Viktor Korotynskiy, Tomáš Pajdla, and Margaret Regan (ongoing).

New Algorithms for Generalized Min Sum Set Cover

Series
ACO Student Seminar
Time
Friday, September 18, 2020 - 13:00 for 1 hour (actually 50 minutes)
Location
https://bluejeans.com/264244877/0166
Speaker
Majid Farhadi, CS, Georgia Tech

We present a new rounding framework and improve the approximation bounds for min sum vertex cover and generalized min sum set cover, also known as the multiple intents re-ranking problem. These classical combinatorial optimization problems find applications in scheduling, speeding up semidefinite-program solvers, and query-results diversification, among others.

Our algorithm is based on transforming the LP solution by a suitable kernel and applying randomized rounding. It also gives a linear-programming-based 4-approximation algorithm for min sum set cover, which is best possible due to the hardness result of Feige, Lovász, and Tetali. As part of the analysis of our randomized algorithm, we derive an inequality on the lower tail of a sum of independent Bernoulli random variables, which may be of independent interest.
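For context, the min sum set cover objective and the classical greedy rule (pick the set covering the most uncovered elements, the 4-approximation analyzed by Feige, Lovász, and Tetali; the talk's LP-based algorithm is different) can be sketched as follows.

def msc_cost(order, elements):
    # Min sum set cover objective: each element pays the (1-based) position
    # of the first set in the ordering that covers it.
    cost, covered = 0, set()
    for t, S in enumerate(order, start=1):
        newly = (S & elements) - covered
        cost += t * len(newly)
        covered |= newly
    assert covered == elements, "the ordering must cover every element"
    return cost

def greedy_msc(sets, elements):
    # Classical greedy rule: repeatedly pick the set covering the most
    # still-uncovered elements.  This is the 4-approximation of Feige,
    # Lovasz, and Tetali, not the LP-based algorithm from the talk.
    remaining, order = set(elements), []
    while remaining:
        best = max(sets, key=lambda S: len(S & remaining))
        order.append(best)
        remaining -= best
    return order

sets = [{1, 2, 3}, {3, 4}, {4, 5}]
elements = {1, 2, 3, 4, 5}
print(msc_cost(greedy_msc(sets, elements), elements))  # 1*3 + 2*2 = 7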

Joint work with Nikhil Bansal, Jatin Batra, and Prasad Tetali. [arXiv:2007.09172]

Online Selection with Cardinality Constraints under Bias

Series
ACO Student Seminar
Time
Friday, September 11, 2020 - 13:00 for 1 hour (actually 50 minutes)
Location
https://bluejeans.com/264244877/0166
Speaker
Jad Salem, Math, Georgia Tech

Optimization and machine learning algorithms often use real-world data that has been generated through complex socio-economic and behavioral processes. This data, however, is noisy, and naturally encodes difficult-to-quantify systemic biases. In this work, we model and address bias in the secretary problem, which has applications in hiring. We assume that utilities of candidates are scaled by unknown bias factors, perhaps depending on demographic information, and show that bias-agnostic algorithms are suboptimal in terms of utility and fairness. We propose bias-aware algorithms that achieve certain notions of fairness, while achieving order-optimal competitive ratios in several settings.
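For reference, the classical bias-agnostic secretary rule (observe roughly n/e candidates, then hire the first one who beats everything seen so far) is the kind of baseline that becomes suboptimal under unknown bias factors. A minimal sketch follows; it is not one of the bias-aware algorithms from the talk.

import math, random

def classical_secretary(utilities):
    # Classical bias-agnostic secretary rule: observe the first n/e
    # candidates without hiring, then hire the first candidate who beats
    # everyone seen so far (taking the last one if forced).
    n = len(utilities)
    cutoff = int(n / math.e)
    threshold = max(utilities[:cutoff], default=float("-inf"))
    for u in utilities[cutoff:]:
        if u > threshold:
            return u
    return utilities[-1]

candidates = list(range(100))
random.shuffle(candidates)
print(classical_secretary(candidates))  # hires the best candidate (99) with probability about 1/e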

CANCELLED: Online Selection with Cardinality Constraints under Bias

Series
ACO Student Seminar
Time
Friday, March 13, 2020 - 13:05 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Jad Salem, Math, Georgia Tech

Optimization and machine learning algorithms often use real-world data that has been generated through complex socio-economic and behavioral processes. This data, however, is noisy, and naturally encodes difficult-to-quantify systemic biases. In this work, we model and address bias in the secretary problem, which has applications in hiring. We assume that utilities of candidates are scaled by unknown bias factors, perhaps depending on demographic information, and show that bias-agnostic algorithms are suboptimal in terms of utility and fairness. We propose bias-aware algorithms that achieve certain notions of fairness, while achieving order-optimal competitive ratios in several settings.
