Seminars and Colloquia by Series

Friday, January 16, 2015 - 13:00 , Location: Skiles 005 , Peter Winkler , Dartmouth College , Organizer:

Joint ARC colloquium/ACO student seminar

Pursuit games, motivated historically by military tactics, are a natural fit for graphical settings and take many forms. We will present some recent results involving (among other things) drunks, Kakeya sets, and a "ketchup graph." Lastly, we describe what we think is the most important open problem in the field.
Friday, October 10, 2014 - 13:00 , Location: Skiles 005 , Robert Krone , Georgia Tech , krone@math.gatech.edu , Organizer:
Given a family of ideals that is symmetric under some group action on the variables, a natural question to ask is whether the generating set stabilizes up to symmetry as the number of variables tends to infinity. We answer this in the affirmative for a broad class of toric ideals, settling several open questions coming from algebraic statistics. Our approach involves factoring an equivariant monomial map into a part for which we have an explicit degree bound on the kernel, and a part for which we can prove that the source, a so-called matching monoid, is equivariantly Noetherian. The proof is mostly combinatorial, making use of the theory of well-partial-orders and its relationship to Noetherianity of monoid rings. Joint work with Jan Draisma, Rob Eggermont, and Anton Leykin.
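A standard illustration of this kind of stabilization, drawn from algebraic statistics and offered here only as background (not necessarily an example treated in the talk): for the n x m matrix of variables (x_{ij}), the toric ideal of the independence model is generated by the 2 x 2 minors x_{ij}x_{kl} - x_{il}x_{kj} with i < k and j < l. Every such minor lies in the orbit of the single binomial x_{11}x_{22} - x_{12}x_{21} under the action of S_n \times S_m permuting rows and columns, so up to symmetry the generating set stabilizes already at n = m = 2.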
Friday, April 18, 2014 - 12:05 , Location: Skiles 005 , Sara Krehbiel , Georgia Tech , Organizer:
Database privacy has garnered a recent surge of interest from the theoretical computer science community following the seminal work of Dwork (2006), which proposed the strong notion of differential privacy. In this setting, each row of a database corresponds to the data owned by some (distinct) individual. An analyst submits a database query to a differentially private mechanism, which replies with a noisy answer guaranteeing privacy for the data owners and accuracy for the analyst. The mechanism's privacy parameter \epsilon is correlated negatively with privacy and positively with accuracy. This work builds a framework for creating and analyzing a market that 1) solves for a socially efficient value of \epsilon using the privacy and accuracy preferences of a heterogeneous body of data owners and a single analyst, 2) computes a noisy statistic on the database, and 3) collects and distributes payments for privacy that elicit truthful reporting of the data owners' preferences. We present a market for database privacy in this new framework, expanding on the public-goods market of Groves and Ledyard (1977).
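As background on the privacy primitive involved, here is a minimal sketch of the standard Laplace mechanism for answering a numeric query with \epsilon-differential privacy. The talk's market determines \epsilon endogenously from reported preferences; the particular function and example data below are illustrative assumptions, not taken from the paper.

import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    """Answer a numeric query with epsilon-differential privacy.
    `sensitivity` bounds how much the true answer can change when one row changes."""
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon   # smaller epsilon: more noise, more privacy, less accuracy
    return true_answer + rng.laplace(loc=0.0, scale=scale)

# Example: privately count the rows satisfying a predicate (a count has sensitivity 1).
ages = [23, 45, 31, 60, 18]
true_count = sum(1 for a in ages if a >= 30)
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)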
Friday, April 4, 2014 - 12:05 , Location: Skiles 005 , Ben Cousins , Georgia Tech , Organizer:
We present randomized algorithms for sampling the standard Gaussian distribution restricted to a convex set and for estimating the Gaussian measure of a convex set, in the general membership oracle model. The complexity of the integration algorithm is O*(n^3), while the complexity of the sampling algorithm is O*(n^3) for the first sample and O*(n^2) for every subsequent sample. These bounds improve on the corresponding state of the art by a factor of n. Our improvement comes from several aspects: better isoperimetry, smoother annealing, avoiding the transformation to isotropic position, and the use of the "speedy walk" in the analysis.
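For intuition about what sampling in the membership oracle model looks like, here is a hedged sketch of a naive Metropolis random walk targeting the standard Gaussian restricted to a convex set K. It is illustrative only, carries no complexity guarantee, and is not the O*(n^3) algorithm of the talk.

import numpy as np

def restricted_gaussian_walk(in_K, x0, num_steps=10000, step_size=0.1, rng=None):
    """Metropolis walk targeting N(0, I) restricted to a convex set K.
    `in_K(x)` is a membership oracle returning True iff x lies in K; x0 must be in K."""
    rng = rng or np.random.default_rng()
    x = np.array(x0, dtype=float)
    for _ in range(num_steps):
        proposal = x + step_size * rng.standard_normal(x.shape)
        # Points outside K have density zero and are rejected outright.
        if in_K(proposal):
            log_accept = 0.5 * (x @ x - proposal @ proposal)  # Gaussian density ratio
            if np.log(rng.random()) < log_accept:
                x = proposal
    return x

# Example: K is the unit cube [-1, 1]^3.
sample = restricted_gaussian_walk(lambda x: bool(np.all(np.abs(x) <= 1.0)), x0=np.zeros(3))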
Friday, March 28, 2014 - 12:05 , Location: Skiles 005 , Ioannis Panageas , Georgia Tech , Organizer:
Since the 1950s and Nash's general proof of equilibrium existence in games, it has been well understood that even simple games may have many, even uncountably many, equilibria with different properties. In such cases a natural question arises: which equilibrium is the right one? In this work, we perform an average-case analysis of evolutionary dynamics in such games. Intuitively, we assign to each equilibrium a probability mass that is proportional to the size of its region of attraction. We develop new techniques to compute these likelihoods for classic games such as the Stag Hunt game (and generalizations) as well as balls-and-bins games. Our proofs combine techniques from information theory (relative entropy), dynamical systems (center manifold theorem), and algorithmic game theory. Joint work with Georgios Piliouras.
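As a toy illustration of weighting equilibria by the size of their regions of attraction, here is a numerical sketch using one arbitrary Stag Hunt payoff matrix and a crude Monte Carlo estimate; the payoffs and the discretized dynamics are assumptions for illustration, not the analytical techniques of the paper.

import numpy as np

# Stag Hunt payoffs for the row player: (stag, stag) = 5, (stag, hare) = 0,
# (hare, stag) = 4, (hare, hare) = 2.  Both pure profiles are strict equilibria.
A = np.array([[5.0, 0.0], [4.0, 2.0]])

def replicator_limit(x0, steps=2000, dt=0.01):
    """Follow discretized replicator dynamics for the fraction x of stag players."""
    x = x0
    for _ in range(steps):
        payoff = A @ np.array([x, 1.0 - x])        # expected payoffs of stag and hare
        average = x * payoff[0] + (1.0 - x) * payoff[1]
        x = min(max(x + dt * x * (payoff[0] - average), 0.0), 1.0)
    return x

rng = np.random.default_rng(0)
starts = rng.random(2000)
stag_likelihood = np.mean([replicator_limit(x0) > 0.5 for x0 in starts])
# For these payoffs the all-stag basin is (2/3, 1], so stag_likelihood comes out near 1/3.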
Friday, November 22, 2013 - 13:05 , Location: Skiles 005 , Cristóbal Guzmán , ISyE, Georgia Tech , cguzman@gatech.edu , Organizer:
First-order (a.k.a. subgradient) methods in convex optimization are a popular choice when facing extremely large-scale problems, where medium-accuracy solutions suffice. The limits of performance of first-order methods can be partially understood under the lens of black box oracle complexity. In this talk I will present some of the limitations of worst-case black box oracle complexity, and I will show two recent extensions of the theory.

First, we extend the notion of oracle complexity to the distributional setting, where complexity is measured as the worst average running time of (deterministic) algorithms against a distribution of instances. In this model, the distribution of instances is part of the input to the algorithm, and thus algorithms can potentially exploit it to accelerate their running time. However, we will show that for nonsmooth convex optimization, distributional lower bounds coincide with worst-case complexity up to a constant factor, and thus all notions of complexity collapse; we can further extend these lower bounds to show that the high running time holds with high probability (this is joint work with Sebastian Pokutta and Gabor Braun).

Second, we extend the worst-case lower bounds for smooth convex optimization to non-Euclidean settings. Our construction mimics the classical proof for the nonsmooth case (based on piecewise-linear functions), but with a local smoothing of the instances. We establish a general lower bound for a wide class of finite-dimensional Banach spaces, and then apply the results to \ell^p spaces for p\in[2,\infty]. A further reduction allows us to extend the lower bounds to p\in[1,2). As consequences, we prove the near-optimality of the Frank-Wolfe algorithm for the box and the spectral norm ball, and we prove near-optimality for function classes that contain the standard convex relaxation of the sparse recovery problem (this is joint work with Arkadi Nemirovski).
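For context, the classical worst-case Euclidean benchmarks that these extensions build on (standard results of Nemirovski-Yudin and Nesterov, stated here only as background): for convex objectives over a ball of radius R, and in dimension large relative to t, any black box method using t first-order oracle queries must in the worst case leave an optimality gap of \Omega(LR/\sqrt{t}) when f is L-Lipschitz, and \Omega(LR^2/t^2) when f has L-Lipschitz gradient. Both rates are attained by known methods (the subgradient method and accelerated gradient descent, respectively), which is what makes such lower bounds tight.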
Friday, November 8, 2013 - 13:05 , Location: Skiles 005 , Gustavo Angulo , ISyE, Georgia Tech , Organizer:
In this talk, we introduce and study the forbidden-vertices problem. Given a polytope P and a subset X of its vertices, we study the complexity of linear optimization over the subset of vertices of P that are not contained in X. This problem is closely related to finding the k best basic solutions of a linear program. We show that the complexity of the problem changes significantly depending on how both P and X are described, that is, on the encoding of the input data. For example, we analyze the case where the complete linear formulation of P is provided, as opposed to the case where P is given by an implicit description (to be defined in the talk). When P has binary vertices only, we provide additional tractability results and linear formulations of polynomial size. Some applications and extensions to integral polytopes will be discussed. Joint work with Shabbir Ahmed, Santanu S. Dey, and Volker Kaibel.
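As a concrete illustration for the binary case (one standard way to exclude a single 0/1 vertex, not necessarily the formulation developed in the talk): if P has 0/1 vertices and \hat{x} is a forbidden vertex, then the inequality
\[
\sum_{i:\,\hat{x}_i=0} x_i \;+\; \sum_{i:\,\hat{x}_i=1} (1-x_i) \;\ge\; 1
\]
is satisfied by every 0/1 vertex of P except \hat{x}, where the left-hand side equals 0; adding one such inequality for each vertex in X therefore forbids exactly X at the cost of |X| extra constraints.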
Friday, November 1, 2013 - 13:05 , Location: Skiles 005 , Yingyu Liang , College of Computing, Georgia Tech , Organizer:
Recently, Bilu and Linial formalized an implicit assumption often made when choosing a clustering objective: that the optimum clustering to the objective should be preserved under small multiplicative perturbations to distances between points. They showed that for max-cut clustering it is possible to circumvent NP-hardness and obtain polynomial-time algorithms for instances resilient to large (factor O(\sqrt{n})) perturbations, and subsequently Awasthi et al. considered center-based objectives, giving algorithms for instances resilient to O(1) factor perturbations. In this talk, for center-based objectives, we present an algorithm that can optimally cluster instances resilient to (1+\sqrt{2})-factor perturbations, solving an open problem of Awasthi et al. For k-median, a center-based objective of special interest, we additionally give algorithms for a more relaxed assumption in which we allow the optimal solution to change in a small fraction of the points after perturbation. We give the first bounds known for k-median under this more realistic and more general assumption. We also provide positive results for min-sum clustering which is a generally much harder objective than center-based objectives. Our algorithms are based on new linkage criteria that may be of independent interest.
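For readers unfamiliar with the term, a linkage criterion is the rule an agglomerative algorithm uses to decide which two clusters to merge next. Below is a sketch of the classical single-linkage rule, shown purely as background; the new criteria developed in the talk are not reproduced here.

import numpy as np

def single_linkage(points, k):
    """Classical single-linkage agglomerative clustering into k clusters."""
    clusters = [[i] for i in range(len(points))]
    dist = lambda a, b: np.linalg.norm(points[a] - points[b])
    while len(clusters) > k:
        # Merge the two clusters whose closest pair of points is nearest.
        i, j = min(
            ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
            key=lambda ij: min(dist(a, b) for a in clusters[ij[0]] for b in clusters[ij[1]]),
        )
        clusters[i] += clusters[j]
        del clusters[j]
    return clusters

points = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
print(single_linkage(points, k=2))   # -> [[0, 1], [2, 3]]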
Friday, October 25, 2013 - 13:05 , Location: Skiles 005 , Ton Dieker , ISyE, Georgia Tech , Organizer:
This talk revolves around Markov functions, i.e., cases where a function of a Markov chain results in another Markov chain. We focus on two examples where this concept yields new results and insights: (1) the evolution of reflected stochastic processes in the study of stochastic networks, and (2) spectral analysis for a special high-dimensional Markov chain.
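For background, the classical criterion (strong lumpability, in the sense of Kemeny and Snell) for when a function of a Markov chain is again Markov: if X has transition kernel P on a state space S and f maps S onto a smaller set S', then f(X) is a Markov chain for every initial distribution if and only if, for all x, y \in S with f(x) = f(y) and every lumped state B \in S',
\[
\sum_{z:\,f(z)=B} P(x,z) \;=\; \sum_{z:\,f(z)=B} P(y,z).
\]
This is offered here only as background on the general notion.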
Friday, October 11, 2013 - 13:05 , Location: Skiles 005 , Jugal Garg , College of Computing, Georgia Tech , Organizer:
Although production is an integral part of the Arrow-Debreu market model, most of the work in theoretical computer science has so far concentrated on markets without production, i.e., the exchange economy. In this work, we take a significant step towards understanding the computational aspects of markets with production. We first define the notion of separable, piecewise-linear concave (SPLC) production by analogy with SPLC utility functions. We then obtain a linear complementarity problem (LCP) formulation that captures exactly the set of equilibria for Arrow-Debreu markets with SPLC utilities and SPLC production, and we give a complementary pivot algorithm for finding an equilibrium. This settles a question asked by Eaves in 1975. Since this is a path-following algorithm, we obtain a proof of membership of this problem in PPAD, using a result of Todd (1976). We also obtain an elementary proof of existence of equilibrium (i.e., without using a fixed point theorem), of rationality, and of oddness of the number of equilibria. Experiments show that our algorithm runs fast on randomly chosen examples, and unlike previous approaches, it does not suffer from issues of numerical instability. Additionally, it is strongly polynomial when the number of goods or the number of agents and firms is constant. This extends the result of Devanur and Kannan (2008) to markets with production. Based on joint work with Vijay V. Vazirani.
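For reference, the standard form of a linear complementarity problem, the general framework such a formulation fits into (the specific matrix M and vector q encoding SPLC utilities and production are constructed in the paper and not reproduced here): given M \in \mathbb{R}^{n\times n} and q \in \mathbb{R}^n, find z such that
\[
w = Mz + q, \qquad w \ge 0, \quad z \ge 0, \quad z^{\top} w = 0.
\]
Complementary pivot (Lemke-type) algorithms walk along edges of the polyhedron \{(z,w) \ge 0 : w = Mz + q\}, maintaining z_i w_i = 0 for all but at most one index at each step, until a fully complementary point, and hence an equilibrium of the original market, is reached.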
