Prof. Emo Welzl and Prof. Bernd Gärtner
Mittagsseminar Talk Information
Date and Time: Tuesday, March 22, 2011, 12:15 pm
Location: OAT S15/S16/S17
Speaker: Kenneth Clarkson (IBM Research Almaden)
I will describe randomized approximation algorithms for some classical problems of machine learning, with provable bounds that hold with high probability. Some of the algorithms are sublinear, that is, they do not need to touch all of the data. Specifically, for a set of points a_1, ..., a_n in d dimensions, we show that a d-vector x approximately maximizing the margin min_i a_i . x can be found in O((n+d)/epsilon^2) time, up to logarithmic factors, where epsilon > 0 is an additive approximation parameter. Our algorithm is a primal-dual version of the classical perceptron training algorithm, in which both the primal and the dual variables are updated using randomization. We have a similar result for the problem of finding the smallest ball containing the input points. We also give versions of our algorithms for the Gaussian and polynomial kernels, at an additional runtime cost of O(1/epsilon^5).
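For orientation, the classical (non-sublinear) iterations that the talk's algorithms build on can be sketched as follows. This is an illustrative sketch under my own naming, not the speaker's implementation: the randomized primal-dual sampling that makes the talk's algorithms sublinear is deliberately omitted, leaving only the deterministic perceptron-style margin update and the well-known Badoiu-Clarkson step for the smallest enclosing ball.

```python
import numpy as np

def perceptron_margin(A, iters=1000):
    """Classical perceptron-style iteration for the margin problem.

    A: (n, d) array of points a_i, assumed to admit a direction of
    positive margin. Returns a unit d-vector x that approximately
    maximizes min_i a_i . x. Only the deterministic primal update is
    shown; the talk's sublinear algorithm additionally maintains
    randomized dual weights over the points and samples coordinates.
    """
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        i = int(np.argmin(A @ x))               # current worst-margin point
        x = x + A[i]                            # perceptron update toward it
        x = x / max(np.linalg.norm(x), 1e-12)   # stay on the unit sphere
    return x

def approx_meb_center(A, iters=100):
    """Simple iteration for the smallest-enclosing-ball center
    (the Badoiu-Clarkson scheme): repeatedly take a shrinking step
    from the current center toward the farthest input point. After
    t steps the radius around the returned center is within a
    (1 + O(1/t)) factor of optimal. Again, this is the classical
    baseline, not the sublinear algorithm of the talk.
    """
    c = A[0].astype(float).copy()
    for t in range(1, iters + 1):
        far = A[np.argmax(np.linalg.norm(A - c, axis=1))]
        c = c + (far - c) / (t + 1)
    return c
```

Both routines touch all n points in every iteration; the point of the talk's primal-dual randomization is precisely to avoid that full pass.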
Joint work with Elad Hazan and David Woodruff.