Seminars and Colloquia by Series

Numerical methods for solving nonlinear PDEs: from homotopy methods to machine learning

Series
Applied and Computational Mathematics Seminar
Time
Monday, October 12, 2020 - 14:00 for 1 hour (actually 50 minutes)
Location
https://bluejeans.com/884917410
Speaker
Wenrui Hao, Penn State University

Many systems of nonlinear PDEs arise in engineering and biology, and their multiple-solution structure, such as pattern formation, has attracted the attention of research scientists. In this talk, I will present several methods for computing the multiple solutions of nonlinear PDEs. Specifically, I will introduce the homotopy continuation technique for computing the multiple steady states of nonlinear differential equations and for exploring the relationship between the number of steady states and the parameters. I will then introduce a randomized Newton's method for solving the nonlinear system arising from a neural network discretization of the nonlinear PDEs. Several benchmark problems will be used to illustrate these ideas.
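To make the homotopy continuation idea concrete, here is a minimal sketch (my illustration, not the speaker's code): an easy start system g(x) = 0 with a known root is deformed into a target f(x) = 0 via H(x, t) = (1 - t) g(x) + t f(x), and the root is tracked with Newton corrections as t marches from 0 to 1. The particular equations and step counts are arbitrary choices for illustration.

```python
import numpy as np

def f(x):  return x**3 - 2.0          # target equation f(x) = 0
def df(x): return 3.0 * x**2

def g(x):  return x**3 - 1.0          # start system with known root x = 1
def dg(x): return 3.0 * x**2

x = 1.0                               # root of the start system
for t in np.linspace(0.0, 1.0, 101):  # march the homotopy parameter
    for _ in range(20):               # Newton corrections at this t
        H  = (1.0 - t) * g(x) + t * f(x)
        dH = (1.0 - t) * dg(x) + t * df(x)
        step = H / dH
        x  -= step
        if abs(step) < 1e-14:
            break

print("tracked root:", x, "vs 2**(1/3) =", 2.0 ** (1.0 / 3.0))
```

Tracking all solution paths of a polynomial start system (rather than one real path, as here) is what lets the method find multiple steady states.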

A contact invariant from bordered Heegaard Floer homology

Series
Geometry Topology Seminar
Time
Monday, October 12, 2020 - 14:00 for 1 hour (actually 50 minutes)
Location
https://dartmouth.zoom.us/j/98031035804?pwd=NnBpTlhVS2lzVzFWTkYyTlloeWVuQT09
Speaker
Ina Petkova, Dartmouth

Given a contact structure on a bordered 3-manifold, we describe an invariant which takes values in the bordered sutured Floer homology of the manifold. This invariant satisfies a nice gluing formula, and recovers the Ozsváth-Szabó contact class in Heegaard Floer homology. This is joint work with Alishahi, Földvári, Hendricks, Licata, and Vértesi.

Zoom info:

Meeting ID: 980 3103 5804

Passcode: 196398

Discrepancy Minimization via a Self-Balancing Walk

Series
Combinatorics Seminar
Time
Friday, October 9, 2020 - 15:00 for 1 hour (actually 50 minutes)
Location
https://bluejeans.com/751242993/PASSWORD (To receive the password, please email Lutz Warnke)
Speaker
Yang P. Liu, Stanford University

We study discrepancy minimization for vectors in R^n under various settings. The main result is the analysis of a new simple random process in multiple dimensions through a comparison argument. As corollaries, we obtain bounds which are tight up to logarithmic factors for several problems in online vector balancing posed by Bansal, Jiang, Singla, and Sinha (STOC 2020), as well as linear time algorithms for logarithmic bounds for the Komlós conjecture.

Based on joint work with Alweiss and Sawhney; see https://arxiv.org/abs/2006.14009
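The random process at the heart of the paper can be sketched in a few lines. The following is a simplified illustration (not the authors' code): each arriving unit vector receives a random sign whose bias pushes the running signed sum back toward the origin. The scale c is Theta(log(nT)) per the paper's analysis; the constant and the random input vectors here are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 20, 2000
c = 30.0 * np.log(n * T)             # balancing scale; constant is illustrative

w = np.zeros(n)                      # running signed sum
for _ in range(T):
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)           # an adversary may choose v; random here
    p = 0.5 - np.dot(w, v) / (2 * c) # bias the sign toward shrinking <w, v>
    p = min(max(p, 0.0), 1.0)
    sign = 1.0 if rng.random() < p else -1.0
    w += sign * v

print("final discrepancy (max coordinate):", np.max(np.abs(w)))
```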

Hyperbolic Relaxations of Locally Positive Semidefinite Matrices

Series
ACO Student Seminar
Time
Friday, October 9, 2020 - 13:00 for 1 hour (actually 50 minutes)
Location
https://bluejeans.com/264244877/0166
Speaker
Kevin Shu, Math, Georgia Tech

Semidefinite programming is a powerful optimization tool which involves optimizing linear functions over a slice of the positive semidefinite (PSD) matrices. Locally PSD matrices are a natural relaxation of the PSD matrices that can be useful in reducing the space required for semidefinite optimization. We use the theory of hyperbolic polynomials to give precise quantitative bounds on the quality of the approximation that results from optimizing over the locally PSD cone instead of the PSD cone.
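For a concrete sense of the relaxation, here is a small sketch (my example, not from the talk): call a symmetric matrix locally PSD with window k if every k x k principal submatrix is PSD. This is strictly weaker than global PSD-ness, as the matrix below shows.

```python
import itertools
import numpy as np

def is_locally_psd(A, k, tol=1e-9):
    """Check that every k x k principal submatrix of symmetric A is PSD."""
    n = A.shape[0]
    for idx in itertools.combinations(range(n), k):
        sub = A[np.ix_(idx, idx)]
        if np.linalg.eigvalsh(sub).min() < -tol:
            return False
    return True

# All 2 x 2 principal submatrices of A are PSD (eigenvalues 0 and 2),
# but A = 2I - J has eigenvalue -1, so A is locally PSD without being PSD.
A = np.array([[ 1.0, -1.0, -1.0],
              [-1.0,  1.0, -1.0],
              [-1.0, -1.0,  1.0]])
print(is_locally_psd(A, 2))                  # True
print(np.linalg.eigvalsh(A).min() >= 0)      # False: A is not globally PSD
```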

Introduction to Kajiwara-Payne Tropicalization II

Series
Student Algebraic Geometry Seminar
Time
Friday, October 9, 2020 - 09:00 for 1 hour (actually 50 minutes)
Location
Microsoft Teams: https://teams.microsoft.com/l/meetup-join/19%3a3a9d7f9d1fca4f5b991b4029b09c69a1%40thread.tacv2/1601996938961?context=%7b%22Tid%22%3a%22482198bb-ae7b-4b25-8b7a-6d7f32faa083%22%2c%22Oid%22%3a%22dc6c6c03-84d2-497a-95c0-d85af9cbcf28%22%7d
Speaker
Trevor Gunn, Georgia Tech

The goal of this talk is to present a summary of Sam Payne's 2009 paper "Analytification is the limit of all tropicalizations" (Math. Res. Lett. 16 (2009), no. 3, 543–556). We will introduce Berkovich analytic spaces, tropicalization of projective varieties, and tropicalization of closed subvarieties of toric varieties, as well as the connections between these concepts. We will try to present many examples.

Note: Part I will focus on tropicalization of affine varieties and Berkovich analytic spaces, Part II will focus on tropicalization of toric varieties and discuss Sam Payne's theorem.
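As a small worked example (mine, not from the talk): for a polynomial over a valued field, each term c * x^a * y^b contributes the affine function val(c) + aX + bY, the tropical polynomial is the minimum of these, and the tropical curve is the locus where that minimum is attained at least twice. Here f = t*x^2 + x*y + t*y^2 + 1 with val(t) = 1.

```python
# (val(c), a, b) for each monomial c * x^a * y^b of f
terms = [(1, 2, 0), (0, 1, 1), (1, 0, 2), (0, 0, 0)]

def on_tropical_curve(X, Y, eps=1e-9):
    """True if the minimum over terms is attained at least twice at (X, Y)."""
    vals = sorted(v + a * X + b * Y for (v, a, b) in terms)
    return vals[1] - vals[0] < eps

# At (-1, 1) the term values are (-1, 0, 3, 0): unique minimum, not on the
# curve. At (0, 0) they are (1, 0, 1, 0): the minimum ties, so (0, 0) lies
# on the tropical curve.
print(on_tropical_curve(-1.0, 1.0), on_tropical_curve(0.0, 0.0))
```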

Approximate Kernel Principal Component Analysis: Computational vs. Statistical Trade-off

Series
Stochastics Seminar
Time
Thursday, October 8, 2020 - 15:30 for 1 hour (actually 50 minutes)
Location
https://gatech.webex.com/gatech/j.php?MTID=mdd4512d3d11623149a0bd46d9fc086c8
Speaker
Bharath Sriperumbudur, Pennsylvania State University

Kernel principal component analysis (KPCA) is a popular non-linear dimensionality reduction technique, which generalizes classical linear PCA by finding functions in a reproducing kernel Hilbert space (RKHS) such that the function evaluation at a random variable $X$ has maximum variance. Despite its popularity, kernel PCA suffers from poor scalability in big data scenarios, as it involves solving an $n \times n$ eigensystem, leading to a computational complexity of $O(n^3)$ with $n$ being the number of samples. To address this issue, in this work we consider a random feature approximation to kernel PCA which requires solving an $m \times m$ eigenvalue problem and therefore has a computational complexity of $O(m^3 + nm^2)$, implying that the approximate method is computationally efficient if $m < n$, with $m$ being the number of random features. The goal of this work is to investigate the trade-off between the computational and statistical behaviors of approximate KPCA, i.e., whether the computational gain is achieved at the cost of statistical efficiency. We show that approximate KPCA is both computationally and statistically efficient compared to KPCA in terms of the error associated with reconstructing a kernel function based on its projection onto the corresponding eigenspaces.
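A minimal sketch of the random feature approximation for the Gaussian kernel (my illustration, not the speaker's code): data are mapped through m random Fourier features and the m x m feature covariance is diagonalized, matching the $O(m^3 + nm^2)$ cost in the abstract. The dimensions and bandwidth below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, m, sigma = 500, 10, 50, 1.0

X = rng.standard_normal((n, d))

# Random Fourier features approximating k(x, y) = exp(-|x - y|^2 / (2 sigma^2))
W = rng.standard_normal((d, m)) / sigma
b = rng.uniform(0, 2 * np.pi, m)
Z = np.sqrt(2.0 / m) * np.cos(X @ W + b)      # n x m feature matrix

Zc = Z - Z.mean(axis=0)                       # center in feature space
C = Zc.T @ Zc / n                             # m x m covariance: O(n m^2)
eigvals, eigvecs = np.linalg.eigh(C)          # m x m eigenproblem: O(m^3)

top = eigvecs[:, ::-1][:, :5]                 # top principal directions
scores = Zc @ top                             # approximate KPCA scores
print(scores.shape, eigvals[::-1][:5])
```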


Topology of cable knots

Series
Geometry Topology Student Seminar
Time
Wednesday, October 7, 2020 - 14:00 for 1 hour (actually 50 minutes)
Location
Speaker
Hyunki Min, Georgia Tech

Cabling is one of the most important knot operations. We study various properties of cable knots and how to characterize a cable knot by its complement.

Inducibility of graphs and tournaments

Series
Graph Theory Seminar
Time
Tuesday, October 6, 2020 - 15:45 for 1 hour (actually 50 minutes)
Location
https://us04web.zoom.us/j/77238664391. For password, please email Anton Bernshteyn (bahtoh ~at~ gatech.edu)
Speaker
Florian Pfender, University of Colorado Denver

A classical question in extremal graph theory asks to maximize the number of induced copies of a given graph or tournament in a large host graph, often expressed as a density. A simple averaging argument shows that the limit of this density exists as the host graph is allowed to grow. Razborov's flag algebra method is well suited to generating bounds on these quantities with the help of semidefinite programming. We will explore this method for a few small examples, and see how to modify it to fit our questions. The extremal graphs show some beautiful structures, sometimes fractal-like, sometimes quasi-random, and sometimes even a combination of both.
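The basic quantity in the talk, the induced density of a small graph H in a host graph G, can be estimated by sampling random vertex subsets. A small sketch of that quantity (my illustration, not the speaker's flag algebra computations):

```python
import random
import networkx as nx

def induced_density(G, H, samples=5000):
    """Estimate the probability that a random k-subset of V(G) induces H."""
    k = H.number_of_nodes()
    nodes = list(G.nodes)
    hits = 0
    for _ in range(samples):
        S = random.sample(nodes, k)
        if nx.is_isomorphic(G.subgraph(S), H):
            hits += 1
    return hits / samples

G = nx.erdos_renyi_graph(200, 0.5, seed=0)
H = nx.path_graph(3)                 # induced paths on 3 vertices
# For G(n, 1/2) the density tends to 3 * (1/2)^3 = 3/8, a quasi-random value.
print(induced_density(G, H))
```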

An enhanced uncertainty principle

Series
Analysis Seminar
Time
Tuesday, October 6, 2020 - 14:00 for 1 hour (actually 50 minutes)
Location
https://us02web.zoom.us/j/71579248210?pwd=d2VPck1CbjltZStURWRWUUgwTFVLZz09
Speaker
Joaquim Ortega-Cerdà, University of Barcelona

We improve on some recent results of Sagiv and Steinerberger that quantify the following uncertainty principle: for a function f with mean zero, either the size of the zero set of the function or the cost of transporting the mass of the positive part of f to its negative part must be big. We also provide a sharp upper estimate of the transport cost of the positive part of an eigenfunction of the Laplacian.

This proves a conjecture of Steinerberger and provides a lower bound of the size of a nodal set of the eigenfunction. Finally, we use a similar technique to provide a measure of how well the points in a design in a manifold are equidistributed. This is a joint work with Tom Carroll and Xavier Massaneda.
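For a one-dimensional illustration of the transport cost in question (my example, not from the paper): if f has mean zero on [0, L], the 1-Wasserstein cost of moving the mass of the positive part f_+ onto the negative part f_- equals the integral of |F|, where F is the antiderivative of f, since F is exactly the difference of the two cumulative distribution functions.

```python
import numpy as np

L = 2 * np.pi
x = np.linspace(0, L, 20001)
f = np.sin(x)                        # mean-zero eigenfunction of -d^2/dx^2

# Antiderivative F(x) = 1 - cos(x) by the trapezoidal rule
F = np.concatenate(([0.0], np.cumsum(0.5 * (f[1:] + f[:-1]) * np.diff(x))))
cost = np.trapz(np.abs(F), x)        # W1(f_+, f_-) = integral of |F|

print("computed:", cost, "exact:", 2 * np.pi)
```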

Regression of functions on a low-dimensional set by neural networks

Series
Undergraduate Seminar
Time
Monday, October 5, 2020 - 15:30 for 1 hour (actually 50 minutes)
Location
Bluejeans meeting https://bluejeans.com/759112674
Speaker
Dr. Wenjing Liao, Georgia Tech

Many data sets in image analysis and signal processing lie in a high-dimensional space but exhibit low-dimensional structures. For example, data can be modeled as point clouds in a high-dimensional space that are concentrated on a low-dimensional set (or a manifold in particular). Our goal is to estimate functions on the low-dimensional manifold from finite samples of data, for statistical inference and prediction. This talk introduces approximation theories of neural networks for functions supported on a low-dimensional manifold. When the function is estimated from finite samples, we give an estimate of the mean squared error for the approximation of these functions. The convergence rate depends on the intrinsic dimension of the manifold instead of the ambient dimension of the data. These results demonstrate that neural networks are adaptive to low-dimensional geometric structures of data.
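A toy version of this setting (my illustration, not the speaker's experiments): the regression target depends only on a circle, a 1-dimensional manifold, embedded in R^20, and a small one-hidden-layer ReLU network is fit to samples by full-batch gradient descent. All sizes and rates below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
D, n = 20, 1000                          # ambient dimension, sample size

# Embed the unit circle in R^D by a random isometry; the target depends
# only on the intrinsic coordinate theta, not on the ambient dimension.
Q, _ = np.linalg.qr(rng.standard_normal((D, 2)))
theta = rng.uniform(0, 2 * np.pi, n)
X = np.column_stack([np.cos(theta), np.sin(theta)]) @ Q.T
y = np.sin(3 * theta)

# One-hidden-layer ReLU network trained by gradient descent on the MSE.
H, lr = 64, 0.1
W1 = rng.standard_normal((D, H)) / np.sqrt(D); b1 = np.zeros(H)
w2 = rng.standard_normal(H) / np.sqrt(H);      b2 = 0.0

for _ in range(5000):
    A = np.maximum(X @ W1 + b1, 0.0)     # hidden ReLU activations
    pred = A @ w2 + b2
    g = 2.0 * (pred - y) / n             # gradient of the mean squared error
    w2 -= lr * (A.T @ g); b2 -= lr * g.sum()
    GA = np.outer(g, w2) * (A > 0)       # backprop through the ReLU
    W1 -= lr * (X.T @ GA); b1 -= lr * GA.sum(axis=0)

print("train MSE:", np.mean((pred - y) ** 2))
```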
