Seminars and Colloquia by Series

Shot Noise Process

Series
Stochastics Seminar
Time
Thursday, March 5, 2009 - 15:00 for 1 hour (actually 50 minutes)
Location
Skiles 269
Speaker
Yuanhui Xiao, Department of Mathematics and Statistics, Georgia State University
A shot noise process is essentially a compound Poisson process whereby the arriving shots are allowed to accumulate or decay after their arrival via some preset shot (impulse response) function. Shot noise models see applications in diverse areas such as insurance, finance, hydrology, textile engineering, and electronics. This talk studies several statistical inference issues for shot noise processes. Under mild conditions, ergodicity is proven in that process sample paths satisfy a strong law of large numbers and central limit theorem. These results have application in storage modeling. Shot function parameter estimation from a data history observed on a discrete-time lattice is then explored. Optimal estimating functions are tractable when the shot function satisfies a so-called interval similar condition. Moment methods of estimation are easily applicable if the shot function is compactly supported and show good performance. In all cases, asymptotic normality of the proposed estimators is established.
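For orientation (a gloss on the definition above, not part of the abstract): with Poisson arrival times \tau_i, marks A_i, and shot function h, a shot noise process is typically written X(t) = \sum_{\tau_i \le t} A_i h(t - \tau_i). The short Python sketch below simulates such a sample path on a discrete-time lattice; the exponential-decay shot function, the exponential marks, the chosen rate, and all function names are illustrative assumptions rather than choices taken from the talk.

import numpy as np

def simulate_shot_noise(rate, t_grid, shot=lambda s: np.exp(-s), rng=None):
    # X(t) = sum over Poisson arrivals tau_i <= t of A_i * shot(t - tau_i)
    rng = np.random.default_rng() if rng is None else rng
    T = t_grid.max()
    n_shots = rng.poisson(rate * T)            # number of arrivals on [0, T]
    taus = rng.uniform(0.0, T, size=n_shots)   # arrival times (uniform given the count)
    amps = rng.exponential(1.0, size=n_shots)  # illustrative shot amplitudes
    x = np.zeros_like(t_grid, dtype=float)
    for tau, a in zip(taus, amps):
        active = t_grid >= tau                 # a shot contributes only after its arrival
        x[active] += a * shot(t_grid[active] - tau)
    return x

# Example: one sample path observed on [0, 10] at 1001 lattice points
t = np.linspace(0.0, 10.0, 1001)
path = simulate_shot_noise(rate=2.0, t_grid=t)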

Scenery reconstruction part II

Series
Stochastics Seminar
Time
Thursday, February 26, 2009 - 15:00 for 1 hour (actually 50 minutes)
Location
Skiles 269
Speaker
Heinrich Matzinger, School of Mathematics, Georgia Tech
Last week we saw combinatorial reconstruction. This time we are going to explain a new approach to Scenery Reconstruction, one which could allow us to prove that being able to distinguish sceneries implies reconstructability.

Optimal alignments and sceneries

Series
Stochastics Seminar
Time
Thursday, February 19, 2009 - 15:00 for 1 hour (actually 50 minutes)
Location
Skiles 269
Speaker
Heinrich Matzinger, School of Mathematics, Georgia Tech
We explore the connection between Scenery Reconstruction and Optimal Alignments. We present some new algorithms for solving the Scenery Reconstruction problem which work in practice, not just in theory.

On creating a model assessment tool independent of data size and estimating the U statistic variance

Series
Stochastics Seminar
Time
Thursday, February 12, 2009 - 15:00 for 1 hour (actually 50 minutes)
Location
Skiles 269
Speaker
Jiawei Liu, Department of Mathematics & Statistics, Georgia State University
If viewed realistically, models under consideration are always false. A consequence of model falseness is that for every data generating mechanism, there exists a sample size at which the model failure will become obvious. There are occasions when one will still want to use a false model, provided that it gives a parsimonious and powerful description of the generating mechanism. We introduce a model credibility index from the point of view that the model is false. The model credibility index is defined as the maximum sample size at which samples from the model and those from the true data generating mechanism are nearly indistinguishable. The index is estimated within a subsampling framework: a large data set is treated as the population, and subsamples generated from it are compared with the model at various sample sizes. Exploring the asymptotic properties of the model credibility index leads to the problem of estimating the variance of U-statistics. An unbiased estimator and a simple fix-up are proposed to estimate the U-statistic variance.
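The abstract does not spell out how "nearly indistinguishable" is quantified, so the following Python sketch is only schematic: the two-sample Kolmogorov-Smirnov comparison, the nominal level, the tolerance on the rejection rate, and the names credibility_index and model_sampler are all assumptions introduced for illustration, not the authors' procedure.

import numpy as np
from scipy.stats import ks_2samp

def credibility_index(data, model_sampler, sizes, alpha=0.05, reps=200, rng=None):
    # Treat `data` as the population; for each candidate sample size m, compare
    # subsamples of the data with model samples of the same size, and record how
    # often a two-sample test tells them apart.
    rng = np.random.default_rng() if rng is None else rng
    credible = 0
    for m in sorted(sizes):
        rejections = 0
        for _ in range(reps):
            sub = rng.choice(data, size=m, replace=False)  # subsample from the "population"
            sim = model_sampler(m)                         # same-size sample from the fitted model
            if ks_2samp(sub, sim).pvalue < alpha:
                rejections += 1
        # "nearly indistinguishable": the rejection rate stays close to the nominal
        # level (the factor 2 is an arbitrary tolerance for this sketch)
        if rejections / reps <= 2 * alpha:
            credible = m
    return credible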

Random trees and SPDE approximation

Series
Stochastics Seminar
Time
Thursday, January 29, 2009 - 15:00 for 1 hour (actually 50 minutes)
Location
Skiles 269
Speaker
Yuri Bakhtin, School of Mathematics, Georgia Tech
This work began in collaboration with C. Heitsch. I will briefly discuss the biological motivation. Then I will introduce Gibbs random trees and study their asymptotics as the tree size grows to infinity. One of the results is a "thermodynamic limit" that allows us to introduce a limiting infinite random tree exhibiting a few curious properties. Under appropriate scaling one can obtain a diffusion limit for the process of generation sizes of the infinite tree. It also turns out that one can study the details of the geometry of the tree by tracing the progenies of subpopulations. Under the same scaling, the limiting continuum random tree can be described as the solution of an SPDE with respect to a Brownian sheet.

Conditions of the uniform convergence of empirical averages to their expectations

Series
Stochastics Seminar
Time
Thursday, January 15, 2009 - 15:00 for 1 hour (actually 50 minutes)
Location
Skiles 269
Speaker
Alexey Chervonenkis, Russian Academy of Sciences and Royal Holloway University of London
The uniform convergence of empirical averages to their expectations for a set of bounded test functions will be discussed. In our previous work, we proved a necessary and sufficient condition for the uniform convergence that can be formulated in terms of the epsilon-entropy of certain sets associated to the sample. In this talk, I will consider the case where that condition is violated. The main result is that in this situation strong almost sure oscillations take place. In fact, with probability one, for a given oscillation pattern, one can find an admissible test function that realizes this pattern for any positive prescribed precision level.
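For context (a gloss, not a quote from the abstract): for a class \mathcal{F} of bounded test functions and i.i.d. observations X_1, \ldots, X_n, the uniform convergence in question is

\[
\sup_{f \in \mathcal{F}} \left| \frac{1}{n} \sum_{i=1}^{n} f(X_i) - \mathbb{E}\, f(X) \right| \xrightarrow[n \to \infty]{} 0 \quad \text{almost surely},
\]

and the necessary and sufficient condition mentioned above is, roughly, that the expected epsilon-entropy of the set of vectors (f(X_1), \ldots, f(X_n)), f \in \mathcal{F}, grows sublinearly in n for every epsilon > 0.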

An Approach to the Gaussian Correlation Conjecture

Series
Stochastics Seminar
Time
Friday, December 12, 2008 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 269
Speaker
Joel Zinn, Texas A&M University
In this approach to the Gaussian Correlation Conjecture we must check the log-concavity of the moment generating function of certain measures pulled down by a particular Gaussian density.
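For reference, the Gaussian correlation conjecture (still open at the time of this talk) asserts that for every centered Gaussian measure \mu on \mathbb{R}^n and all convex sets K and L that are symmetric about the origin,

\[
\mu(K \cap L) \;\ge\; \mu(K)\,\mu(L).
\]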

Dunkl processes, eigenvalues of random matrices and the Weyl-chamber

Series
Stochastics Seminar
Time
Tuesday, November 25, 2008 - 15:00 for 1 hour (actually 50 minutes)
Location
Skiles 269
Speaker
Nizar Demni, University of Bielefeld
We will introduce the Dunkl derivative as well as the Dunkl process and some of its properties. We will treat its radial part, called the radial Dunkl process, and highlight the connection to the eigenvalues of some matrix-valued processes and to the so-called Brownian motions in Weyl chambers. Some open problems will be discussed at the end.
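For orientation (not taken from the abstract), in the rank-one case the Dunkl derivative with multiplicity parameter k \ge 0 acts on a smooth function f by

\[
T_k f(x) = f'(x) + \frac{k}{x}\,\bigl(f(x) - f(-x)\bigr),
\]

which reduces to the ordinary derivative when k = 0; the Dunkl process is the càdlàg Markov process whose generator is built from the corresponding Dunkl Laplacian.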

Smoothed Weighted Empirical Likelihood Ratio Confidence Intervals for Quantiles

Series
Stochastics Seminar
Time
Thursday, November 20, 2008 - 15:00 for 1 hour (actually 50 minutes)
Location
Skiles 269
Speaker
Jian-Jian Ren, Department of Mathematics, University of Central Florida
So far, likelihood-based interval estimates for quantiles have not been studied in the literature for case 2 interval-censored data and partly interval-censored data, and in this context the use of smoothing has not been considered for any type of censored data. This article constructs smoothed weighted empirical likelihood ratio confidence intervals (WELRCI) for quantiles in a unified framework for various types of censored data, including right-censored data, doubly censored data, interval-censored data, and partly interval-censored data. The 4th-order expansion of the weighted empirical log-likelihood ratio is derived, and the 'theoretical' coverage accuracy equation for the proposed WELRCI is established, which generally guarantees at least 'first-order' accuracy. In particular, for right-censored data we show that the coverage accuracy is at least O(n^{-1/2}), and our simulation studies show that in comparison with empirical likelihood-based methods, the smoothing used in WELRCI generally gives a shorter confidence interval with comparable coverage accuracy. For interval-censored data, it is interesting to find that with an adjusted rate n^{-1/3}, the weighted empirical log-likelihood ratio has an asymptotic distribution completely different from that obtained by the empirical likelihood approach, and the resulting WELRCI perform favorably in available comparison simulation studies.

Computing Junction Forests from Filtrations of Simplicial Complexes

Series
Stochastics Seminar
Time
Friday, November 14, 2008 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 255
Speaker
Sayan Mukherjee, Department of Statistical Science, Duke University
Let X=(X_1,\ldots,X_n) be an n-dimensional random vector whose distribution has Markov structure corresponding to a junction forest, assuming functional forms for the marginal distributions associated with the cliques of the underlying graph. We propose a latent variable approach based on computing junction forests from filtrations. This methodology establishes connections between efficient algorithms from Computational Topology and Graphical Models, leading to parametrizations of the space of decomposable graphs such that: i) the dimension grows linearly with respect to n, and ii) they are convenient for MCMC sampling.
