Seminars and Colloquia by Series

Dynamical critical 2d first-passage percolation

Series
Stochastics Seminar
Time
Thursday, March 3, 2022 - 15:30 for 1 hour (actually 50 minutes)
Location
ONLINE
Speaker
David Harper (Georgia Tech)

In first-passage percolation (FPP), we let \tau_v be i.i.d. nonnegative weights on the vertices of a graph and study the weight of the minimal path between distant vertices. If F is the distribution function of \tau_v, there are different regimes: if F(0) is small, this weight typically grows like a linear function of the distance, and when F(0) is large, the weight is typically of order one. In between these is the critical regime, in which the weight can diverge but does so sublinearly. This talk will consider a dynamical version of critical FPP on the triangular lattice, where vertices resample their weights according to independent rate-one Poisson processes. We will discuss results showing that if \sum_k F^{-1}(1/2 + 2^{-k}) diverges, then a.s. there are exceptional times at which the weight grows atypically, but if \sum_k k^{7/8} F^{-1}(1/2 + 2^{-k}) converges, then a.s. there are no such times. Furthermore, in the former case, we compute the Hausdorff and Minkowski dimensions of the exceptional set and show that they can be, but need not be, equal. These results exhibit a wider range of dynamical behavior than one sees in subcritical (usual) FPP. This is joint work with M. Damron, J. Hanson, and W.-K. Lam.
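As a rough illustration of the passage-time quantity described above (not the speaker's construction), the following Python sketch computes the minimal total vertex weight of a path across an n x n square grid with i.i.d. {0,1}-valued weights, using Dijkstra's algorithm; the talk itself concerns the triangular lattice with a dynamical resampling of the weights, which this sketch does not attempt to model.

# Minimal sketch of static vertex-weighted first-passage percolation on a
# square grid. Each vertex gets weight 0 with probability p_zero (playing
# the role of F(0)) and weight 1 otherwise; we compute the weight of the
# minimal path between opposite corners.
import heapq
import random

def passage_time(n, p_zero=0.5):
    weight = [[0 if random.random() < p_zero else 1 for _ in range(n)]
              for _ in range(n)]
    dist = [[float("inf")] * n for _ in range(n)]
    dist[0][0] = weight[0][0]
    heap = [(dist[0][0], 0, 0)]
    while heap:
        d, x, y = heapq.heappop(heap)
        if d > dist[x][y]:
            continue  # stale heap entry
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < n and 0 <= ny < n:
                nd = d + weight[nx][ny]
                if nd < dist[nx][ny]:
                    dist[nx][ny] = nd
                    heapq.heappush(heap, (nd, nx, ny))
    return dist[n - 1][n - 1]

if __name__ == "__main__":
    print(passage_time(100, p_zero=0.5))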

This talk will be given on BlueJeans at the link https://bluejeans.com/283104959/2281

Low-rank Structured Data Analysis: Methods, Models and Algorithms

Series
Job Candidate Talk
Time
Tuesday, February 22, 2022 - 11:00 for 1 hour (actually 50 minutes)
Location
https://bluejeans.com/717545499/6211
Speaker
Longxiu Huang (UCLA)

In modern data analysis, datasets are often represented by large-scale matrices or tensors (the generalization of matrices to higher dimensions). To better understand these data, or to extract value from them effectively, an important step is to construct a low-dimensional/compressed representation that is easier to analyze and interpret in light of a corpus of field-specific information. A primary tool for this is matrix/tensor decomposition. In this talk, I will present CUR decompositions, a class of matrix/tensor decompositions that are memory efficient and computationally cheap. I will also discuss how CUR decompositions can be applied to develop efficient algorithms and models for robust decomposition and data completion problems. Finally, simulation results on real and synthetic datasets will be presented.
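As a generic illustration of the kind of decomposition named in the abstract (not the speaker's algorithm; the column and row sampling here is uniform, whereas practical methods use more careful selection), here is a minimal NumPy sketch of a matrix CUR decomposition A ≈ C U R:

# Minimal CUR sketch: C is a subset of columns of A, R a subset of rows,
# and U = pinv(C) A pinv(R) minimizes ||A - C U R||_F for the chosen C, R.
import numpy as np

def cur_decomposition(A, k, seed=None):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    cols = rng.choice(n, size=k, replace=False)  # uniformly sampled columns
    rows = rng.choice(m, size=k, replace=False)  # uniformly sampled rows
    C = A[:, cols]
    R = A[rows, :]
    U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)
    return C, U, R

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 150))  # rank-10 test matrix
    C, U, R = cur_decomposition(A, k=10, seed=1)
    err = np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A)
    print(f"relative reconstruction error: {err:.2e}")

Because C and R are actual columns and rows of A, the factors remain interpretable in the original coordinates, and only the sampled columns, rows, and the small k x k core need to be stored, which is one source of the memory efficiency mentioned in the abstract.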
