Seminars and Colloquia Schedule

Learning geometry from incomplete pairwise distances: Theory, algorithms and applications

Series
Applied and Computational Mathematics Seminar
Time
Monday, January 12, 2026 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 254 and https://gatech.zoom.us/j/94954654170
Speaker
Abiy Tasissa, Tufts University

The advancement of technology has significantly enhanced our capacity to collect data. However, in many real-world applications, certain inherent limitations, such as the precision of measurement devices, environmental conditions, or operating costs, can result in missing data. In this talk, we focus on the setting where the available data consists of pairwise distances between a set of points, with the goal of estimating the configuration of the underlying geometry from incomplete distance measurements. This is known as the Euclidean distance geometry (EDG) problem and is central to many applications.

We start by describing the solution when all distances are given, using the classical multidimensional scaling (MDS) technique, and then discuss a constructive approach to interpreting the key mathematical objects in MDS. Next, we introduce a mathematical framework to address the EDG problem under two sampling models of the distance matrix: global sampling (uniform sampling of the entries of the distance matrix) and structured local sampling, where the measurements are limited to a subset of rows and columns. We discuss the conditions required for exact recovery of the point configuration and the associated algorithms. The last part of the talk will illustrate the algorithms using synthetic and real data and discuss ongoing work.
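The classical MDS step mentioned above admits a compact implementation: given a complete matrix of squared pairwise distances, double centering produces a Gram matrix whose top eigenpairs recover the configuration up to rigid motion. The sketch below is not code from the talk, just a minimal numpy illustration with an arbitrary synthetic point set:

```python
import numpy as np

def classical_mds(D2, dim):
    """Recover a point configuration from a complete matrix of
    squared pairwise distances via classical MDS."""
    n = D2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    G = -0.5 * J @ D2 @ J                 # Gram matrix of the centered points
    w, V = np.linalg.eigh(G)              # eigendecomposition (ascending order)
    idx = np.argsort(w)[::-1][:dim]       # keep the top `dim` eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))

# Synthetic points in the plane; the configuration is recovered
# only up to translation, rotation, and reflection, so we compare distances.
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 2))
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
Y = classical_mds(D2, 2)
D2_rec = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
assert np.allclose(D2, D2_rec)  # pairwise distances are reproduced exactly
```

With exact, complete distances the recovery is exact; the EDG problem discussed in the talk concerns the much harder setting where most entries of D2 are unobserved.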

Some upper and lower bounds on the variance of functions of independent random variables

Series
Probability Working Seminar
Time
Tuesday, January 13, 2026 - 15:30 for 1.5 hours (actually 80 minutes)
Location
Skiles 006
Speaker
Christian Houdré, Georgia Institute of Technology

First of several talks.

I'll present various methods, some old, some new, leading to estimates on the variance of $f(X_1, X_2, \dots, X_n)$, where $X_1, X_2, \dots, X_n$ are independent random variables. These methods will be illustrated with various examples.
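One classical upper bound of this type is the Efron–Stein inequality, $\mathrm{Var}\, f(X) \le \tfrac12 \sum_i E[(f(X) - f(X^{(i)}))^2]$, where $X^{(i)}$ resamples the $i$-th coordinate independently. Whether this particular bound is among those covered is not stated in the abstract, but it illustrates the flavor; here is a Monte Carlo check for the hypothetical choice $f = \max$ of uniform variables:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 5, 200_000                     # n variables, m Monte Carlo samples
X = rng.uniform(size=(m, n))
f = X.max(axis=1)                     # f(X_1, ..., X_n) = max_i X_i

var_f = f.var()

# Efron-Stein: Var f(X) <= (1/2) * sum_i E[(f(X) - f(X^(i)))^2],
# where X^(i) has its i-th coordinate resampled independently.
es_bound = 0.0
for i in range(n):
    Xi = X.copy()
    Xi[:, i] = rng.uniform(size=m)    # resample coordinate i
    es_bound += 0.5 * np.mean((f - Xi.max(axis=1)) ** 2)

assert var_f <= es_bound              # the bound holds (with a visible gap)
```

For the maximum of uniforms the bound is not tight, which is one reason sharper upper (and matching lower) estimates are of interest.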

Maximization of recurrent sequences, Schur positivity, and derivative bounds in Lagrange interpolation

Series
Analysis Seminar
Time
Wednesday, January 14, 2026 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Dmitrii Ostrovskii, Georgia Institute of Technology

Consider the following extremal problem: maximize the amplitude |X_T|, at time T, of a linear recurrent sequence X_1, X_2, ... of order N < T, under natural constraints: (I) the initial values are uniformly bounded; (II) the characteristic polynomial is R-stable, i.e., its roots lie in the origin-centered disc of radius R. While the maximum at time T = N essentially follows from the classical Gautschi bound (1960), the general case T > N turns out to be considerably more challenging. We find that for any triple (N, R, T), the amplitude is maximized when the roots coincide and have modulus R, and the initial values are chosen to align the phases of the fundamental solutions. This result is striking for two reasons. First, the same configuration of roots and initial values is uniformly optimal for all T, i.e., the whole envelope is maximized at once. Second, we are not aware of any purely analytical proof: ours uses tools from algebraic combinatorics, namely Schur polynomials indexed by hook partitions.
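The phenomenon can be observed numerically (this sketch is not taken from the paper; the order, radius, and initial values below are arbitrary choices for illustration): a recurrence whose characteristic roots coincide on the circle of radius R grows polynomially faster than one with distinct roots of the same modulus, even with initial values bounded by 1.

```python
import numpy as np

def run_recurrence(roots, initials, T):
    """Iterate x_t = c_1 x_{t-1} + ... + c_N x_{t-N}, where the c_k come
    from the monic characteristic polynomial with the given roots."""
    c = -np.poly(roots)[1:]            # poly coefficients -> recurrence weights
    x = list(initials)
    for t in range(len(initials), T + 1):
        # dot the weights with the last N values, newest first
        x.append(float(np.real(np.dot(c, x[-1:-len(c) - 1:-1]))))
    return x

T = 40
# Order N = 3, R = 1. Coincident roots at 1; initial values in [-1, 1]
# chosen to excite the polynomial mode (here x_t = 2t^2 - 4t + 1).
x_coincident = run_recurrence([1.0, 1.0, 1.0], [1.0, -1.0, 1.0], T)
# Distinct roots of the same modulus, same initial values: bounded envelope.
x_distinct = run_recurrence(
    [1.0, np.exp(2j), np.exp(-2j)], [1.0, -1.0, 1.0], T)

assert abs(x_coincident[-1]) > abs(x_distinct[-1])
```

Of course a numerical comparison for one parameter choice proves nothing; the talk's result is that the coincident-root configuration is optimal for every triple (N, R, T) simultaneously.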

In the talk, I will sketch the proof of this result, making it as self-contained as possible under the circumstances. If time permits, we will discuss a related conjecture on optimal error bounds in complex Lagrange interpolation.

The talk is based on the work https://arxiv.org/abs/2508.13554.

How trustworthy AI enables a paradigm shift in classical statistics for particle physics

Series
Stochastics Seminar
Time
Thursday, January 15, 2026 - 15:30 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Aishik Ghosh, Georgia Tech

Particle physics research relies on making statistical statements about Nature. The field is one of the last bastions of classical statistics and certainly among its most rigorous users, relying on a worldwide computing grid to process zettabyte-scale data. Recent AI-enabled developments have reinvigorated research in classical statistics, particularly by removing the need for asymptotic approximations in many calculations.
In this talk, I will discuss how AI has allowed us to question core assumptions in our statistical inference techniques. Neural networks enable high-dimensional statistical inference, avoiding aggressive data reduction or the use of unnecessary assumptions. However, they also introduce new sources of systematic uncertainty that require novel uncertainty quantification tools. AI further enables more robust statistical inference by accelerating Neyman inversion and confidence-interval calibration. These advances allow the design of new test statistics that leverage Bayesian mathematical tools while still guaranteeing frequentist coverage, an approach that was previously considered computationally infeasible. These new techniques raise questions about practical methods for handling nuisance parameters, the definition of point estimators, and the computationally efficient implementation of mathematical solutions. If time permits, I will also introduce the emerging challenge of non-nestable hypothesis testing in particle physics.
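As background on Neyman inversion (a toy sketch, not the speaker's AI-accelerated method): instead of relying on asymptotic approximations, one simulates the test statistic's distribution at each parameter value and collects the non-rejected values into a confidence set. The hypothetical example below inverts a simple Gaussian-mean test with Monte Carlo "toys":

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Neyman inversion for the mean mu of a unit-variance Gaussian,
# using the sample mean as test statistic and Monte Carlo toys in place
# of an asymptotic approximation to its distribution.
n, alpha, n_toys = 20, 0.32, 5000      # sample size, test level, toys per point
mu_grid = np.linspace(-2.0, 2.0, 201)  # parameter grid to scan

x_obs = rng.normal(0.5, 1.0, size=n)   # "observed" data (true mu = 0.5)
t_obs = x_obs.mean()

accepted = []
for mu in mu_grid:
    # Distribution of the test statistic under this hypothesized mu
    toys = rng.normal(mu, 1.0, size=(n_toys, n)).mean(axis=1)
    lo, hi = np.quantile(toys, [alpha / 2, 1 - alpha / 2])
    if lo <= t_obs <= hi:              # mu is not rejected at level alpha
        accepted.append(mu)

# The confidence interval is the set of non-rejected mu values.
ci = (min(accepted), max(accepted))
```

This construction guarantees frequentist coverage by design, but each scanned parameter point costs a fresh batch of simulations, which is exactly where the acceleration discussed in the talk matters for realistic particle-physics likelihoods.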
My group is among the teams leading this revitalization of classical statistical research in particle physics, and I look forward to connecting with students and senior colleagues at Georgia Tech who are interested in contributing to this emerging field.
Bio: Aishik Ghosh is an assistant professor in the School of Physics at Georgia Tech with a focus on developing AI methods to accelerate fundamental physics and astrophysics. His group works on theoretical physics, statistical methods, and experiment design. For robust scientific applications, Dr. Ghosh focuses on uncertainty quantification, interpretability, and verifiability of AI algorithms, targeting publications in physics journals and ML conferences.