Polynomial Decompositions in Machine Learning

Series
Algebra Seminar
Time
Monday, April 22, 2019 - 12:50pm (50 minutes)
Location
Skiles 005
Speaker
Joe Kileel – Princeton University – jkileel@math.princeton.edu
Organizer
Justin Chen

This talk will be about polynomial decompositions that are relevant in machine learning. I will start with the well-known low-rank symmetric tensor decomposition and present a simple new algorithm with local convergence guarantees, which seems to handily outperform the state of the art in experiments. Next, I will consider a particular generalization of symmetric tensor decomposition and apply it to estimate subspace arrangements from very many, very noisy samples (a regime in which current subspace clustering algorithms break down). Finally, I will switch gears and discuss the representability of polynomials by deep neural networks with polynomial activations. The various polynomial decompositions in this talk motivate questions in commutative algebra, computational algebraic geometry, and optimization. The first part of this talk is joint work with Emmanuel Abbe, Tamir Bendory, Joao Pereira, and Amit Singer, while the final part is joint work with Matthew Trager.
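
As background on the first topic: a symmetric tensor T of order d on R^n has a rank-r decomposition when it splits into r symmetric rank-one terms. In standard notation (illustrative; the talk's conventions may differ):

```latex
% Rank-r decomposition of an order-d symmetric tensor T on R^n:
\[
  T \;=\; \sum_{i=1}^{r} \lambda_i \, v_i^{\otimes d},
  \qquad \lambda_i \in \mathbb{R}, \quad v_i \in \mathbb{R}^n,
\]
% entrywise: T_{j_1 \cdots j_d} = \sum_{i=1}^{r} \lambda_i \, v_i(j_1) \cdots v_i(j_d).
```

Equivalently, T is the coefficient tensor of the degree-d polynomial \(\sum_i \lambda_i \langle v_i, x\rangle^d\), which is what makes this a polynomial decomposition.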
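On the second topic, one standard way subspace arrangements connect to such decompositions (an illustrative sketch; the generalization in the talk may be formulated differently): a union of k hyperplanes is the zero set of a single product of linear forms, whose coefficient tensor is a symmetrized rank-one term built from distinct vectors.

```latex
% A union of k hyperplanes H_i = { x : <a_i, x> = 0 } is the zero set of
\[
  p(x) \;=\; \prod_{i=1}^{k} \langle a_i, x \rangle,
\]
% whose symmetric coefficient tensor is Sym(a_1 \otimes \cdots \otimes a_k):
% a symmetrized product of k distinct vectors rather than one repeated vector,
% generalizing the v^{\otimes d} terms above.
```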
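On the final topic, a feedforward network whose activation is itself a polynomial computes a polynomial map of its input, so representability becomes an algebraic question. A minimal sketch, assuming the coordinatewise activation \(\sigma(t) = t^r\) (a common choice in this line of work; the talk may allow more general polynomial activations):

```latex
% Depth-L network with affine layers A_0, ..., A_L and activation sigma(t) = t^r:
\[
  p(x) \;=\; A_L\,\sigma\bigl(A_{L-1}\,\sigma(\cdots\,\sigma(A_0 x)\,\cdots)\bigr),
\]
% each coordinate of p is a polynomial in x of degree at most r^L; the
% representability question asks which polynomials of that degree arise this way.
```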