Seminars and Colloquia Schedule

A Combinatorial Description of the knot concordance invariant epsilon

Series
Geometry Topology Seminar
Time
Monday, November 9, 2020 - 14:00 for 1 hour (actually 50 minutes)
Location
Speaker
Hakan Doga, University of Buffalo

Computing and understanding the behavior of concordance invariants obtained from knot Floer homology theories is quite central to the study of the concordance group and low-dimensional topology in general. In this talk, I will describe the method that allows us to compute the concordance invariant epsilon using combinatorial knot Floer homology, and I will talk about some computational results. This is joint work with S. Dey.

Ranking from pairwise comparisons

Series
Undergraduate Seminar
Time
Monday, November 9, 2020 - 15:30 for 1 hour (actually 50 minutes)
Location
Bluejeans meeting: https://gatech.bluejeans.com/759112674
Speaker
Dr. Mao Cheng, Georgia Institute of Technology

Ranking items from comparisons is a ubiquitous task in many real-world applications. For example, sports teams can be ranked based on outcomes of matches; students' homework solutions can be ranked based on peer grading. In this lecture, I will discuss: (1) how we can design mathematical models for the problem of ranking or rating a set of items from pairwise comparisons between them; (2) how to do statistical inference based on these models. The model we focus on is the Bradley-Terry model proposed in 1952, which is also related to the Elo rating system developed for the US Chess Federation in 1960.
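As a rough illustration of the model named in the abstract (not code from the lecture), the sketch below fits Bradley-Terry scores from pairwise win counts using the classical minorization-maximization (Zermelo) iteration; all win counts are invented for the example.

```python
# Bradley-Terry model: item i beats item j with probability s_i / (s_i + s_j),
# where s_i > 0 is item i's latent score. The MM (Zermelo) iteration below
# recovers the scores from observed win counts.

def fit_bradley_terry(wins, n_items, iters=500):
    """wins[i][j] = number of times item i beat item j."""
    s = [1.0] * n_items
    for _ in range(iters):
        new_s = []
        for i in range(n_items):
            w_i = sum(wins[i])  # total wins of item i
            denom = sum((wins[i][j] + wins[j][i]) / (s[i] + s[j])
                        for j in range(n_items) if j != i)
            new_s.append(w_i / denom if denom > 0 else s[i])
        total = sum(new_s)
        s = [x / total for x in new_s]  # normalize: scores are scale-free
    return s

# Three items; item 0 mostly beats item 1, item 1 mostly beats item 2.
wins = [[0, 8, 9],
        [2, 0, 7],
        [1, 3, 0]]
scores = fit_bradley_terry(wins, 3)
ranking = sorted(range(3), key=lambda i: -scores[i])
print(ranking)  # expected: [0, 1, 2]
```

The iteration is the standard maximum-likelihood fixed point for this model; in practice one would also check convergence rather than run a fixed number of sweeps.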

Marstrand's Theorem in general Banach spaces

Series
Analysis Seminar
Time
Tuesday, November 10, 2020 - 02:00 for 1 hour (actually 50 minutes)
Location
https://us02web.zoom.us/j/71579248210?pwd=d2VPck1CbjltZStURWRWUUgwTFVLZz09
Speaker
Bobby Wilson, University of Washington

We will discuss Marstrand's classical theorem concerning the interplay between density of a measure and the Hausdorff dimension of the measure's support in the context of finite-dimensional Banach spaces. This is joint work with David Bate and Tatiana Toro.

Universal graphs and planarity

Series
Graph Theory Seminar
Time
Tuesday, November 10, 2020 - 12:30 for 1 hour (actually 50 minutes)
Location
https://us04web.zoom.us/j/77238664391. For password, please email Anton Bernshteyn (bahtoh ~at~ gatech.edu)
Speaker
Louis Esperet, Université Grenoble Alpes

Note the unusual time!

The following are two classical questions in the area of universal graphs.

1. What is the minimum number of vertices in a graph that contains all $n$-vertex planar graphs as induced subgraphs?

2. What is the minimum number of edges in a graph that contains all $n$-vertex planar graphs as subgraphs?

We give nearly optimal constructions for each problem, i.e., graphs with $n^{1+o(1)}$ vertices for Question 1 and graphs with $n^{1+o(1)}$ edges for Question 2. The proofs combine a recent structure theorem for planar graphs (of independent interest) with techniques from data structures.

Joint work with V. Dujmovic, C. Gavoille, G. Joret, P. Micek, and P. Morin.

Hodge theory for tropical varieties 1

Series
Algebra Seminar
Time
Wednesday, November 11, 2020 - 15:30 for 1 hour (actually 50 minutes)
Location
ONLINE
Speaker
Matthieu Piquerez

Part 1 of 3-part series

The aim of this series of talks is to give an overview of our work on tropical Hodge theory. We show that the cohomology groups of smooth projective tropical varieties satisfy the hard Lefschetz property and the Hodge-Riemann relations. Combined with a description of the Chow groups of matroids in terms of the cohomology groups of specific smooth projective tropical varieties, these results can be regarded as a generalization of the work of Adiprasito-Huh-Katz to more general tropical varieties. We also prove that smooth projective tropical varieties satisfy the tropical analogue of the weight-monodromy conjecture, affirming a conjecture of Mikhalkin and Zharkov.

BlueJeans link: https://bluejeans.com/476849994

Insights on gradient-based algorithms in high-dimensional non-convex learning

Series
School of Mathematics Colloquium
Time
Thursday, November 12, 2020 - 11:00 for 1 hour (actually 50 minutes)
Location
https://us02web.zoom.us/j/89107379948
Speaker
Lenka Zdeborová, EPFL

Gradient descent algorithms and their noisy variants, such as the Langevin dynamics or multi-pass SGD, are at the center of attention in machine learning. Yet their behaviour remains perplexing, in particular in the high-dimensional non-convex setting. In this talk, I will present several high-dimensional and non-convex statistical learning problems in which the performance of gradient-based algorithms can be analysed down to a constant. The common point of these settings is that the data come from a probabilistic generative model, leading to problems for which, in the high-dimensional limit, statistical physics provides exact closed-form solutions for the performance of the gradient-based algorithms. The covered settings include the spiked mixed matrix-tensor model and phase retrieval.

When Do Neural Networks Outperform Kernel Methods?

Series
Stochastics Seminar
Time
Thursday, November 12, 2020 - 15:30 for 1 hour (actually 50 minutes)
Location
https://bluejeans.com/445382510
Speaker
Song Mei, UC Berkeley

For a certain scaling of the initialization of stochastic gradient descent (SGD), wide neural networks (NNs) have been shown to be well approximated by reproducing kernel Hilbert space (RKHS) methods. Recent empirical work showed that, for some classification tasks, RKHS methods can replace NNs without a large loss in performance. On the other hand, two-layer NNs are known to encode richer smoothness classes than RKHS methods, and we know of special examples for which SGD-trained NNs provably outperform RKHS methods. This is true also in the wide network limit, for a different scaling of the initialization.

How can we reconcile the above claims? For which tasks do NNs outperform RKHS methods? If feature vectors are nearly isotropic, RKHS methods suffer from the curse of dimensionality, while NNs can overcome it by learning the best low-dimensional representation. Here we show that this curse of dimensionality becomes milder if the feature vectors display the same low-dimensional structure as the target function, and we precisely characterize this tradeoff. Building on these results, we present a model that can capture in a unified framework both behaviors observed in earlier work. We hypothesize that such a latent low-dimensional structure is present in image classification. We test this hypothesis numerically by showing that specific perturbations of the training distribution degrade the performance of RKHS methods much more significantly than that of NNs.

Chain conditions in power series and polynomial rings

Series
Student Algebraic Geometry Seminar
Time
Friday, November 13, 2020 - 09:00 for 1 hour (actually 50 minutes)
Location
Microsoft Teams Meeting
Speaker
Hamed Mousavi, Georgia Tech

Following the Hilbert Basis theorem and its applications, there has been a wide variety of studies involving chain conditions over polynomial or power series rings. One type of chain condition is the Archimedean condition, which says that $\bigcap_n Rt^n = 0$ for any nonunit element $t$ in the ring $R$. In this talk, we start with the ascending chain condition on principal ideals (ACCP) over the larger class of skew generalized power series rings. Then we explain the relation between ACCP rings and Archimedean rings and give a partial answer to the question of when these properties can be lifted from the ring $R$ to the ring $R[[x; \alpha]]$. In particular, we show that if $R$ is an Archimedean reduced ring satisfying the ACC on annihilators, then $R[[x]]$ is also an Archimedean reduced ring.

The lattice metric space and its applications

Series
Research Horizons Seminar
Time
Friday, November 13, 2020 - 11:30 for 1 hour (actually 50 minutes)
Location
ONLINE
Speaker
Yuchen He, Georgia Tech

Lattice patterns are commonly observed in materials science, where microscopic structural nuances induce distinct macroscopic physical or chemical properties. Given two lattices of the same dimension, how do we measure their differences in a visually consistent way? Mathematically, any $n$-D lattice is determined by a set of $n$ independent vectors. Since such a basis representation is not unique, a direct comparison of basis representations in Euclidean space is highly ambiguous. In this talk, I will focus on 2-D lattices and introduce the lattice metric space proposed in my earlier work. This geometric space is constructed mainly by integrating modular group theory with the Poincaré metric. In the lattice metric space, each point represents a unique lattice pattern, and the visual difference between two patterns is measured by the shortest path connecting them. Some applications of the lattice metric space will be presented. If time allows, I will briefly discuss potential extensions to 3-D lattices.
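As a hedged sketch of the ingredients named in the abstract (modular group, Poincaré metric), and not the speaker's actual construction: a 2-D lattice basis can be encoded as a point in the upper half-plane, reduced into the standard fundamental domain of the modular group, and compared via the hyperbolic distance. The basis vectors below are illustrative choices.

```python
import math

def to_upper_half_plane(v1, v2):
    """Encode a 2-D lattice basis (v1, v2) as the point z = v2/v1 in the
    upper half-plane, via complex arithmetic (shape up to rotation/scaling)."""
    z = complex(*v2) / complex(*v1)
    return z if z.imag > 0 else z.conjugate()

def reduce_modular(z):
    """Move z into the standard fundamental domain of the modular group
    SL(2, Z): |Re z| <= 1/2 and |z| >= 1."""
    while True:
        z = complex(z.real - round(z.real), z.imag)  # translation z -> z + n
        if abs(z) < 1:
            z = -1 / z  # inversion z -> -1/z
        else:
            return z

def poincare_distance(z, w):
    """Hyperbolic (Poincaré) distance on the upper half-plane."""
    return math.acosh(1 + abs(z - w) ** 2 / (2 * z.imag * w.imag))

# Square lattice vs. hexagonal lattice: distinct points, positive distance.
square = reduce_modular(to_upper_half_plane((1, 0), (0, 1)))  # z = i
hexagonal = reduce_modular(to_upper_half_plane((1, 0), (0.5, math.sqrt(3) / 2)))
print(poincare_distance(square, hexagonal) > 0)
```

Reducing to the fundamental domain makes the comparison basis-independent, which is the kind of ambiguity the abstract describes; the actual metric-space construction in the talk may differ in its details.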

Approximation Algorithms for Mixed Integer Non-Linear Optimization Problems

Series
ACO Student Seminar
Time
Friday, November 13, 2020 - 13:00 for 1 hour (actually 50 minutes)
Location
ONLINE
Speaker
Guanyi Wang, ISyE, Georgia Tech

For computationally intensive mixed integer non-linear optimization problems, a major challenge is to verify or guarantee the quality of a feasible solution in a tractable fashion under mild assumptions. In this talk, we tackle this challenge by constructing tight relaxations and designing approximation algorithms for two different mixed integer non-linear optimization problems.

In the first part, we focus on the (row) sparse principal component analysis (rsPCA) problem. Solving rsPCA means finding the top-$r$ leading principal components of a covariance matrix such that all of these principal components are linear combinations of a subset of $k$ variables. The rsPCA problem is a widely used dimensionality reduction tool with an additional sparsity constraint that enhances interpretability. We propose (a) a convex integer programming relaxation of rsPCA that gives upper (dual) bounds for rsPCA, and (b) a new local search algorithm for finding primal feasible solutions for rsPCA. We also show that, in the worst case, the dual bounds provided by the convex IP are within an affine function of the global optimal value. We demonstrate our techniques on large-scale covariance matrices.

In the second part, we focus on improving the execution speed of compute-intensive numerical code. Such code, especially of the variety encountered in deep neural network inference and training, is often written using nested for-loops. One of the main bottlenecks that significantly influences the execution speed of nested for-loops is so-called memory latency. Iteration space tiling is a common memory management technique used to deal with memory latency. We study the problem of automatically optimizing the implementation of these nested loops by formulating the iteration space tiling problem as an integer geometric programming (IGP) problem. We show how to design an efficient approximation algorithm for this problem and how the so-called "non-uniform tiling" technique improves execution speed.
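To illustrate iteration space tiling itself (a generic example, not the IGP formulation from the talk), the sketch below computes a matrix transpose both naively and in T x T tiles; the tile size T is an arbitrary choice, whereas choosing it well is precisely the optimization problem the abstract describes.

```python
# Loop tiling (blocking): both functions compute the same transpose, but the
# tiled version visits the input in T x T blocks, so each block stays in
# cache while it is being written out, reducing memory latency stalls.

def transpose_naive(a, n):
    return [[a[j][i] for j in range(n)] for i in range(n)]

def transpose_tiled(a, n, T=4):
    out = [[0] * n for _ in range(n)]
    for ii in range(0, n, T):                      # loop over tiles
        for jj in range(0, n, T):
            for i in range(ii, min(ii + T, n)):    # loop within a tile
                for j in range(jj, min(jj + T, n)):
                    out[i][j] = a[j][i]
    return out

n = 10  # deliberately not a multiple of T, to exercise the boundary tiles
a = [[i * n + j for j in range(n)] for i in range(n)]
assert transpose_tiled(a, n) == transpose_naive(a, n)
print("tiled and naive transposes agree")
```

In Python the caching benefit is invisible, but the same loop structure in C or in generated kernel code is where the speedups come from; "non-uniform tiling" would correspond to letting T vary across the iteration space.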

The first part of the talk is joint work with Santanu S. Dey, Rahul Mazumder, and Marco Molinaro; the second part is joint work with Ofer Dekel.