Seminars and Colloquia Schedule

Effective Chabauty for Sym^2

Series
Algebra Seminar
Time
Monday, April 29, 2013 - 15:05 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Jennifer Park (MIT)
While we know by Faltings' theorem that curves of genus at least 2 have finitely many rational points, his theorem is not effective. In 1985, R. Coleman showed that Chabauty's method, which works when the Mordell-Weil rank of the Jacobian of the curve is small, can be used to give a good effective bound on the number of rational points of curves of genus g > 1. In this talk, we draw ideas from tropical geometry to show that we can also give an effective bound on the number of rational points of Sym^2(X) that are not parametrized by a projective line or an elliptic curve, where X is a (hyperelliptic) curve of genus g > 2, when the Mordell-Weil rank of the Jacobian of the curve is at most g-2.
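
For orientation (background only, not part of the abstract), Coleman's bound is commonly quoted in the following form: if X is a smooth projective curve of genus g at least 2 over Q, p > 2g is a prime of good reduction, and the Mordell-Weil rank of the Jacobian is at most g - 1, then
\[
  \#X(\mathbb{Q}) \;\le\; \#X(\mathbb{F}_p) \;+\; 2g - 2 .
\]
% Coleman (1985), under the stated hypotheses; #X(F_p) counts points of the reduction mod p.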

Logarithmic Sobolev inequalities and strong data processing theorems for discrete channels

Series
Other Talks
Time
Monday, April 29, 2013 - 15:05 for 1 hour (actually 50 minutes)
Location
Klaus 1116W
Speaker
Maxim Raginsky (University of Illinois, Urbana-Champaign)
The problem of quantifying the amount of information loss due to a random transformation (or a noisy channel) arises in a variety of contexts, such as machine learning, stochastic simulation, error-correcting codes, or computation in circuits with noisy gates, to name just a few. This talk will focus on discrete channels, where both the input and output sets are finite. The noisiness of a discrete channel can be measured by comparing suitable functionals of the input and output distributions. For instance, if we fix a reference input distribution, then the worst-case ratio of output relative entropy (Kullback-Leibler divergence) to input relative entropy over all other input distributions is bounded by one, by the data processing theorem. However, this quantity may be strictly smaller than one, giving so-called strong data processing inequalities (SDPIs). I will show that the problem of determining both the best constant in an SDPI and any input distributions that achieve it can be addressed using logarithmic Sobolev inequalities, which relate input relative entropy to certain measures of input-output correlation. I will also show that SDPIs for Kullback-Leibler divergence arise as a limiting case of a family of SDPIs for Rényi divergence, and discuss the relationship to hypercontractivity of Markov operators.
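
In symbols (the notation below is not the speaker's: W is the channel, \pi the fixed reference input distribution, D the Kullback-Leibler divergence), the quantity described above is the contraction coefficient
\[
  \eta(\pi, W) \;=\; \sup_{P \,:\, 0 < D(P \,\|\, \pi) < \infty} \frac{D(PW \,\|\, \pi W)}{D(P \,\|\, \pi)} \;\le\; 1 ,
\]
% PW and \pi W are the output distributions induced by the channel W.
% The data processing theorem gives \eta <= 1; an SDPI is the strict bound \eta(\pi, W) < 1.
In this notation, the talk concerns computing or bounding this constant and identifying the input distributions P that attain the supremum.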

L^p theory for outer measures

Series
Analysis Seminar
Time
Wednesday, May 1, 2013 - 10:07 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Yen Do (Yale University)
In this talk I will describe an L^p theory for outer measures, which could be used to connect two themes of Lennart Carleson's work: Carleson measures and time-frequency analysis. This is joint work with Christoph Thiele.
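
As background on the first of the two themes (a standard definition, not taken from the talk): a positive measure \mu on the upper half-space \mathbb{R}^{n+1}_+ is a Carleson measure if
\[
  \sup_{Q} \frac{\mu\bigl(Q \times (0, \ell(Q))\bigr)}{|Q|} \;<\; \infty ,
\]
% Q ranges over cubes in R^n, \ell(Q) is the side length of Q, and |Q| its Lebesgue measure.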