Seminars and Colloquia by Series

Domain decomposition methods for large problems of elasticity

Series
Applied and Computational Mathematics Seminar
Time
Monday, November 14, 2011 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Olof Widlund, Courant Institute, New York University, Mathematics and Computer Science
The domain decomposition methods considered are preconditioned conjugate gradient methods designed for the very large algebraic systems of equations which often arise in finite element practice. They are designed for massively parallel computer systems, and the preconditioners are built from solvers on the substructures into which the domain of the given problem is partitioned. In addition, to obtain scalability, there must be a coarse problem with a small number of degrees of freedom for each substructure. The design of this coarse problem is crucial for obtaining rapidly convergent iterations and poses the most interesting challenge in the analysis.

Our work will be illustrated by overlapping Schwarz methods for almost incompressible elasticity approximated by mixed finite element and mixed spectral element methods. These algorithms are now used extensively at the Sandia laboratories in Albuquerque and were developed in close collaboration with Dr. Clark R. Dohrmann. These results illustrate two roles of the coarse component of the preconditioner.

Currently, these algorithms are being actively developed for problems posed in H(curl) and H(div). This work requires the development of new coarse spaces. We will also comment on recent work on extending domain decomposition theory to subdomains with quite irregular boundaries. This work is relevant because of the use of mesh partitioners in the decomposition of large finite element matrices.

Examples of negatively curved manifolds (after Ontaneda)

Series
Geometry Topology Working Seminar
Time
Friday, November 11, 2011 - 14:05 for 2 hours
Location
Skiles 006
Speaker
Igor Belegradek, Georgia Tech
This is the second in a series of two talks aimed at discussing recent work of Ontaneda which gives a powerful method of producing negatively curved manifolds. Ontaneda's work adds a lot of weight to Gromov's often-quoted prediction that, in a sense, most manifolds (of any dimension) are negatively curved. In this second talk I shall discuss some ideas of the proof.

ARC Theory Day

Series
Other Talks
Time
Friday, November 11, 2011 - 09:20 for 1 hour (actually 50 minutes)
Location
Klaus 1116 E&W
Speaker
ARC Theory Day, Algorithms and Randomness Center, Georgia Tech
Algorithms and Randomness Center (ARC) Theory Day is an annual event showcasing lectures on recent exciting developments in theoretical computer science. This year's inaugural event features four young speakers who have made valuable contributions to the field. In addition, this year we are fortunate to have Avi Wigderson from the Institute for Advanced Study (Princeton) speak on fundamental questions and progress in computational complexity to a general audience. See the complete list of titles and times of talks.

The complete mixability and its applications

Series
Stochastics Seminar
Time
Thursday, November 10, 2011 - 15:05 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Ruodu Wang, School of Mathematics, Georgia Institute of Technology
The common marginal distribution of identically distributed random variables having a constant sum is called a completely mixable distribution. In this talk, the concept, history, and current research on complete mixability will be introduced. I will discuss its relevance to existing problems in the Frechet class, i.e., problems with known marginal distributions but unknown joint distribution, and its applications in quantitative risk management.
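Two elementary examples of complete mixability, checked numerically (these are standard textbook-style illustrations chosen here for concreteness, not examples taken from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)

# Discrete example: the uniform distribution on {0, 1, 2} is 3-completely-mixable.
# Take (X1, X2, X3) to be a uniformly random permutation of (0, 1, 2): each
# coordinate is then uniform on {0, 1, 2}, while the sum is always 0+1+2 = 3.
samples = np.array([rng.permutation(3) for _ in range(30000)])
sums = samples.sum(axis=1)
print(set(sums))                                   # the sum is constant
print(np.bincount(samples[:, 0]) / len(samples))   # each value appears ~ 1/3

# Continuous example: Uniform(0,1) is 2-completely-mixable via the pair (U, 1-U):
# both coordinates are Uniform(0,1) and their sum is the constant 1.
u = rng.random(30000)
print(np.ptp(u + (1.0 - u)))                       # ~0: constant sum up to rounding
```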

The power and weakness of randomness (when you are short on time)

Series
School of Mathematics Colloquium
Time
Thursday, November 10, 2011 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Avi Wigderson, School of Mathematics, Institute for Advanced Study

Please Note: This is a joint ARC-SoM colloquium, and is in conjunction with the ARC Theory Day on November 11, 2011

Man has grappled with the meaning and utility of randomness for centuries. Research in the Theory of Computation in the last thirty years has enriched this study considerably. I'll describe two main aspects of this research on randomness, demonstrating respectively its power and weakness for making algorithms faster. I will address the role of randomness in other computational settings, such as space bounded computation and probabilistic and zero-knowledge proofs.

Chromatic Derivatives and Approximations

Series
Analysis Seminar
Time
Wednesday, November 9, 2011 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 006
Speaker
Aleks Ignjatovic, University of New South Wales
Chromatic derivatives are special, numerically robust linear differential operators which provide a unifying framework relating a broad class of orthogonal polynomials to a broad class of special functions. They are used to define chromatic expansions, which generalize the Neumann series of Bessel functions. Such expansions are motivated by signal processing; they grew out of the design of a switch-mode power amplifier. Chromatic expansions provide a local signal representation complementary to the global signal representation given by the Shannon sampling expansion. Unlike the Taylor expansion, which they are intended to replace, they share all the properties of the Shannon expansion which are crucial for signal processing.

Besides being a promising new tool for signal processing, chromatic derivatives and expansions have intriguing mathematical properties, connecting orthogonal polynomials in a novel way with some familiar concepts and theorems of harmonic analysis. For example, they introduce novel spaces of almost periodic functions which naturally correspond to a broad class of families of orthogonal polynomials containing most classical families. We also present a conjecture which generalizes the Paley-Wiener Theorem and which relates the growth rate of entire functions to the asymptotic behavior of the recursion coefficients of a corresponding family of orthogonal polynomials.

Viscosity solutions and applications to stochastic optimal control

Series
Research Horizons Seminar
Time
Wednesday, November 9, 2011 - 12:05 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Andrzej Swiech, Georgia Tech
I will give a brief introduction to the theory of viscosity solutions of second order PDE. In particular, I will discuss Hamilton-Jacobi-Bellman-Isaacs equations and their connections with stochastic optimal control and stochastic differential game problems. I will also present extensions of viscosity solutions to integro-PDE.

Discrimination of binary patterns by perceptrons with binary weights

Series
Mathematical Biology Seminar
Time
Wednesday, November 9, 2011 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 005
Speaker
Andrei Olifer, Georgia Gwinnett College
Information processing in neurons and their networks is understood incompletely, especially when neuronal inputs have indirect correlates with external stimuli, as for example in the hippocampus. We study a case when all neurons in one network receive inputs from another network within a short time window. We consider it as a mapping of binary vectors of spiking activity ("spike" or "no spike") in an input network to binary vectors of spiking activity in the output network. Intuitively, if an input pattern makes a neuron spike, then the neuron should also spike in response to similar patterns; otherwise, neurons would be too sensitive to noise. On the other hand, neurons should discriminate between sufficiently different input patterns and spike selectively. Our main goal was to quantify how well neurons discriminate input patterns depending on the connectivity between networks, the spiking threshold of neurons, and other parameters. We modeled neurons with perceptrons that have binary weights. Most recent results on perceptron neuronal models are asymptotic with respect to some parameters. Here, using combinatorial analysis, we complement them with exact formulas. In particular, those formulas predict the number of inputs per neuron that maximizes the difference between the neuronal and network responses to similar and distinct inputs. This is joint work with Jean Vaillant (UAG).
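A minimal Monte Carlo sketch of the discrimination question the abstract poses (the binary-weight perceptron model is from the abstract; the specific network size, threshold, and agreement metric below are illustrative assumptions, not the speaker's exact formulas):

```python
import numpy as np

rng = np.random.default_rng(1)

def spikes(w, x, theta):
    # Binary-weight perceptron: spike iff the summed input reaches the threshold
    return int(w @ x >= theta)

def flip(x, k, rng):
    # Perturb a binary pattern at k randomly chosen positions (Hamming distance k)
    y = x.copy()
    y[rng.choice(len(x), size=k, replace=False)] ^= 1
    return y

n, theta, trials = 100, 25, 5000
p_agree = {}
for k in (2, 50):  # similar (k=2) vs distinct (k=50) input patterns
    agree = 0
    for _ in range(trials):
        w = rng.integers(0, 2, n)  # binary synaptic weights
        x = rng.integers(0, 2, n)  # binary input pattern ("spike"/"no spike")
        agree += spikes(w, x, theta) == spikes(w, flip(x, k, rng), theta)
    p_agree[k] = agree / trials
print(p_agree)
```

In this toy setting the response to a slightly perturbed pattern agrees with the original far more often than the response to a heavily perturbed one, which is the robustness-versus-selectivity trade-off the abstract describes.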
