Prof. Emo Welzl and Prof. Bernd Gärtner
Mittagsseminar Talk Information
Date and Time: Thursday, January 12, 2017, 12:15 pm
Duration: 45 minutes
Location: OAT S15/S16/S17
Speaker: David Steurer (Cornell Univ., Ithaca, and Inst. of Advanced Study, Princeton)
Abstract: Tensor decomposition is at the heart of many computational tasks that arise in machine learning and other kinds of data analysis, for example, learning mixtures of Gaussians and sparse dictionaries. I will present a recent algorithmic framework for this problem based on a powerful convex relaxation method called sum-of-squares. This framework gives the first polynomial-time algorithms with provable guarantees for many interesting classes of instances. Underlying our results is a general strategy for turning algorithms with prohibitive sample complexity into sample-efficient ones. Finally, we demonstrate how to extract fast and practical algorithms from our framework that do not require solving convex relaxations but still achieve comparable provable guarantees. The running times of the resulting algorithms are linear or close to linear in the size of the input. Based on joint works with Boaz Barak, Sam Hopkins, Jonathan Kelner, Tengyu Ma, Tselil Schramm, and Jonathan Shi (STOC'15, STOC'16, FOCS'16).
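To illustrate the tensor decomposition problem the abstract refers to, here is a minimal sketch that recovers the factors of a rank-1 third-order tensor via alternating least squares (ALS), a classical heuristic. This is an assumed illustrative example, not the sum-of-squares framework presented in the talk; all variable names are hypothetical.

```python
import numpy as np

# Illustrative sketch (not the talk's method): decompose a rank-1
# third-order tensor T = a (x) b (x) c by alternating least squares.
rng = np.random.default_rng(0)
n = 8
a, b, c = (rng.standard_normal(n) for _ in range(3))
T = np.einsum('i,j,k->ijk', a, b, c)  # the tensor to decompose

# Random initial estimates; alternate closed-form least-squares updates:
# fixing two factors, the optimal third factor is a weighted contraction.
x, y, z = (rng.standard_normal(n) for _ in range(3))
for _ in range(50):
    x = np.einsum('ijk,j,k->i', T, y, z) / ((y @ y) * (z @ z))
    y = np.einsum('ijk,i,k->j', T, x, z) / ((x @ x) * (z @ z))
    z = np.einsum('ijk,i,j->k', T, x, y) / ((x @ x) * (y @ y))

T_hat = np.einsum('i,j,k->ijk', x, y, z)
err = np.linalg.norm(T - T_hat) / np.linalg.norm(T)
print(f"relative reconstruction error: {err:.2e}")
```

On exact rank-1 inputs ALS behaves like a power method and converges quickly from a generic random start; the hard regimes the talk addresses (higher rank, overcomplete, or noisy instances) are where such local heuristics lack guarantees and sum-of-squares relaxations become relevant.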