Seminars and Colloquia by Series

Joint DOS/ACO Seminar - The Reflex algorithm - Convex optimization by random reflection

Series
Other Talks
Time
Wednesday, March 17, 2010 - 13:30 for 1 hour (actually 50 minutes)
Location
ISyE Executive Classroom
Speaker
Merrick Furst, College of Computing, Georgia Tech
Santosh Vempala and I have been exploring an intriguing new approach to convex optimization. Intuition about first-order interior point methods tells us that a main impediment to quickly finding an inside track to the optimum is that a convex body's boundary can get in one's way in so many directions from so many places. If the surface of a convex body is made perfectly reflecting, then from every interior vantage point it essentially disappears. Wondering what this might mean for designing a new type of first-order interior point method, a preliminary analysis offers a surprising and suggestive result. Scale a convex body a sufficient amount in the direction of optimization. Mirror its surface and look directly upwards from anywhere. Then, in the distance, you will see a point that is as close as desired to optimal. We wouldn't recommend a direct implementation, since it doesn't work in practice. However, by trial and error we have developed a new algorithm for convex optimization, which we are calling Reflex. Reflex alternates greedy random reflecting steps, which can get stuck in narrow reflecting corridors, with simply-biased random reflecting steps that escape. We have early experimental experience with a first implementation of Reflex, written in Matlab, solving LPs (it can be faster than Matlab's linprog), SDPs (dense, with several thousand variables), quadratic cone problems, and some standard NETLIB problems.
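
The abstract does not spell out what a reflecting step looks like, so here is a minimal, purely illustrative sketch (not the authors' Matlab implementation) of a billiard-style step inside a polytope {x : Ax <= b}: a ray is advanced a fixed distance and its direction is reflected off each facet it hits along the way. All names and parameters are hypothetical.

    import numpy as np

    def reflecting_step(A, b, x, d, step_len, max_bounces=50):
        """Advance x a total distance step_len along direction d inside
        {y : A y <= b}, reflecting the direction off each facet that is hit."""
        d = d / np.linalg.norm(d)
        remaining = step_len
        for _ in range(max_bounces):
            Ad = A @ d
            slack = b - A @ x                      # slack of each constraint at x
            with np.errstate(divide="ignore", invalid="ignore"):
                t_hit = np.where(Ad > 1e-12, slack / Ad, np.inf)
            i = int(np.argmin(t_hit))              # first facet the ray would hit
            if t_hit[i] >= remaining:              # no facet reached: finish the step
                return x + remaining * d
            x = x + t_hit[i] * d                   # move to the facet ...
            n = A[i] / np.linalg.norm(A[i])
            d = d - 2.0 * (d @ n) * n              # ... and reflect the direction off it
            remaining -= t_hit[i]
        return x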

Town Hall Meeting of the Graduate Students and Graduate Coordinator

Series
Research Horizons Seminar
Time
Tuesday, March 16, 2010 - 12:00 for 1 hour (actually 50 minutes)
Location
Skiles 255
Speaker
Luca Dieci, Professor and Graduate Coordinator, School of Mathematics

Please Note: Hosted by Huy Huynh and Yao Li

We will have a chance to spend some time together to discuss issues of relevance to the Graduate Program. Sort of like a "Town Hall Meeting" of the graduate students and the graduate coordinator. There are some things that I need to communicate to all of you, but the format is otherwise unstructured, and I am seeking suggestions on things which you would like to see addressed. So, please send me comments on things which you would like to see discussed and I will do my best to get ready for them. Thanks, Luca Dieci.

On a parametrization of positive semidefinite matrices

Series
Algebra Seminar
Time
Monday, March 15, 2010 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 171
Speaker
Josephine Yu, Georgia Tech
We study a class of parametrizations of convex cones of positive semidefinite matrices with prescribed zeros. Each such cone corresponds to a graph whose non-edges determine the prescribed zeros. Each parametrization in the class is a polynomial map associated with a simplicial complex comprising cliques of the graph. The images of the maps are convex cones, and the maps can only be surjective onto the cone of zero-constrained positive semidefinite matrices when the associated graph is chordal. Our main result gives a semi-algebraic description of the image of the parametrizations for chordless cycles. The work is motivated by the fact that the considered maps correspond to Gaussian statistical models with hidden variables. This is joint work with Mathias Drton.
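
As a small illustration (not taken from the talk) of the cones in question, consider the 4-cycle C_4, the smallest chordless cycle: its non-edges {1,3} and {2,4} prescribe the zero entries, and membership in the cone is positive semidefiniteness plus that zero pattern. A hypothetical numpy check:

    import numpy as np

    def in_cone_C4(M, tol=1e-10):
        """Is M a symmetric PSD 4x4 matrix whose entries at the non-edges of C_4
        (positions (1,3) and (2,4), i.e. (0,2) and (1,3) with 0-indexing) vanish?"""
        zero_pattern = abs(M[0, 2]) < tol and abs(M[1, 3]) < tol
        psd = np.all(np.linalg.eigvalsh((M + M.T) / 2) >= -tol)
        return zero_pattern and psd

    # 2*I plus the adjacency matrix of the 4-cycle: eigenvalues 4, 2, 2, 0
    M = np.array([[2.0, 1.0, 0.0, 1.0],
                  [1.0, 2.0, 1.0, 0.0],
                  [0.0, 1.0, 2.0, 1.0],
                  [1.0, 0.0, 1.0, 2.0]])
    print(in_cone_C4(M))   # True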

CANCELLED

Series
Applied and Computational Mathematics Seminar
Time
Monday, March 15, 2010 - 13:00 for 1 hour (actually 50 minutes)
Location
Skiles 255
Speaker
Maria Cameron, Courant Institute, NYU
The overdamped Langevin equation is often used as a model in molecular dynamics. At low temperatures, a system evolving according to such an SDE spends most of the time near the potential minima and performs rare transitions between them. A number of methods have been developed to study the most likely transition paths. I will focus on one of them: the MaxFlux functional. The MaxFlux functional has been around for almost thirty years, but it has not been widely used because it is challenging to minimize. Its minimizer provides a path along which the reactive flux is maximal at a given finite temperature. I will show two ways to derive it in the framework of transition path theory: the lower bound approach and the geometrical approach. I will present an efficient way to minimize the MaxFlux functional numerically. I will demonstrate its application to the problem of finding the most likely transition paths in the Lennard-Jones-38 cluster between the face-centered-cubic and icosahedral structures.
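
For context, a minimal Euler-Maruyama sketch of the overdamped Langevin equation dX = -grad V(X) dt + sqrt(2/beta) dW in a one-dimensional double-well potential; the potential V(x) = (x^2 - 1)^2 is chosen here only for illustration and is not part of the talk. At low temperature (large beta) the trajectory sits near the two minima and hops between them only rarely, which is exactly the rare-transition regime the MaxFlux functional addresses.

    import numpy as np

    # Euler-Maruyama for  dX = -V'(X) dt + sqrt(2/beta) dW  with V(x) = (x^2 - 1)^2.
    def grad_V(x):
        return 4.0 * x * (x**2 - 1.0)

    def simulate(beta=8.0, dt=1e-3, n_steps=200_000, x0=-1.0, seed=0):
        rng = np.random.default_rng(seed)
        x = np.empty(n_steps)
        x[0] = x0
        sigma = np.sqrt(2.0 * dt / beta)           # noise amplitude per step
        for k in range(1, n_steps):
            x[k] = x[k - 1] - grad_V(x[k - 1]) * dt + sigma * rng.standard_normal()
        return x

    traj = simulate()
    print("fraction of time in the right-hand well:", (traj > 0.0).mean())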

Turing patterns and standing waves of FitzHugh-Nagumo type systems

Series
CDSNS Colloquium
Time
Monday, March 15, 2010 - 11:00 for 1 hour (actually 50 minutes)
Location
Skiles 269
Speaker
Chao-Nien Chen, National Changhua University, Taiwan
Many interesting patterns are observed in activator-inhibitor systems; a well-known model is the FitzHugh-Nagumo system. Via the calculus of variations, there is a close relation between the stability of a steady state and its relative Morse index. We give a sufficient condition on the diffusivities for the existence of standing wavefronts joining with Turing patterns.
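
For reference, one common form of the FitzHugh-Nagumo activator-inhibitor system reads (the talk's exact formulation and parameters may differ):

    u_t = d_1 \Delta u + f(u) - v, \qquad
    \tau v_t = d_2 \Delta v + u - \gamma v, \qquad
    f(u) = u(1 - u)(u - \beta), \quad 0 < \beta < 1,

with u the activator and v the inhibitor. Turing instabilities typically require the inhibitor to diffuse much faster than the activator (d_2 >> d_1), which is the kind of condition on the diffusivities alluded to in the abstract.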

PI DAY!!

Series
Other Talks
Time
Sunday, March 14, 2010 - 13:59 for 3 hours
Location
Skiles Courtyard
Speaker
N/A, GT
Come celebrate Pi Day with Math Club! Pot-luck, so bring food! Math Club will be providing the pies, so we ask that everyone else try to bring more substantial food. ;) Bring any games and such you want as well.

Introduction to Khovanov Homology

Series
Geometry Topology Working Seminar
Time
Friday, March 12, 2010 - 14:00 for 1 hour (actually 50 minutes)
Location
Skiles 269
Speaker
Alan Diaz, Georgia Tech
Khovanov homology is an invariant of oriented links, defined as the cohomology of a chain complex built from the cube of resolutions of a link diagram. Discovered in the late 90s, it is the first of, and inspiration for, a series of "categorifications" of knot invariants. In this first of two one-hour talks, I'll give some background on categorification and the Jones polynomial, define Khovanov homology, work through some examples, and give a portion of the proof of Reidemeister invariance.
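
One standard way to state the categorification referred to above (a well-known fact, not part of the abstract itself): the graded Euler characteristic of Khovanov homology recovers the unnormalized Jones polynomial,

    \hat{J}(L)(q) \;=\; \sum_{i,j} (-1)^{i}\, q^{j}\, \dim_{\mathbb{Q}} Kh^{i,j}(L; \mathbb{Q}),

so the Jones polynomial is recovered from the bigraded homology groups, while the homology itself carries strictly more information.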

Sparsity in machine learning: recovery in convex hulls of infinite dictionaries

Series
SIAM Student Seminar
Time
Friday, March 12, 2010 - 13:00 for 1 hour (actually 50 minutes)
Location
Skiles 255
Speaker
Stanislav Minsker, School of Mathematics, Georgia Tech
We will start with a brief introduction to the broad area of machine learning, with the focus on empirical risk minimization methods and their connection to the theory of empirical processes. Using some results from our recent work with V. Koltchinskii, I will explain how sparsity affects the risk bounds.
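
For readers new to the area, the generic empirical risk minimization problem over the convex hull of a dictionary H looks like the following (a schematic formulation, not necessarily the estimator analyzed in the talk):

    \hat{f} \;=\; \operatorname*{arg\,min}_{f \in \operatorname{conv}(H)} \ \frac{1}{n} \sum_{i=1}^{n} \ell\bigl(Y_i, f(X_i)\bigr),
    \qquad
    \operatorname{conv}(H) \;=\; \Bigl\{ \sum_{j} \lambda_j h_j \;:\; h_j \in H,\ \lambda_j \ge 0,\ \sum_{j} \lambda_j = 1 \Bigr\},

where "sparsity" refers to solutions, or oracle benchmarks, that put most of their weight lambda on a few dictionary elements; that is the regime in which the risk bounds improve.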
