Department of Computer Science | Institute of Theoretical Computer Science | CADMO
Prof. Emo Welzl and Prof. Bernd Gärtner
Mittagsseminar Talk Information
Date and Time: Tuesday, May 02, 2017, 12:15 pm
Duration: 30 minutes
Location: OAT S15/S16/S17
Speaker: Kenneth Clarkson (IBM Research Almaden)
A number of matrices that arise in machine learning and data analysis are symmetric positive semidefinite (PSD), including covariance matrices, kernel matrices, Laplacian matrices, random dot product graph models, and others. A common task for such matrices is to approximate them with a low-rank matrix, for efficiency or statistical inference; spectral clustering, kernel PCA, manifold learning, and Gaussian process regression can all involve this task. Given a square n-by-n matrix A, a target rank k, and an error parameter epsilon, we show how to find a PSD matrix B of rank k such that the Frobenius norm of A-B is within a factor (1+epsilon) of the best possible; our algorithm needs O(nnz(A) + n*poly(k/epsilon)) time, where nnz(A) is the number of nonzero entries of A. We also show how to find such a rank-k PSD matrix B of the form CUC^T, where the O(k/epsilon) columns of C are a subset of those of A, with a similar runtime. Joint work with David Woodruff.
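As a point of reference for the problem the talk addresses, the sketch below is a minimal NumPy baseline (an assumption of this note, not the algorithm from the talk): it computes the exact best rank-k PSD approximation of a symmetric matrix in Frobenius norm via a full eigendecomposition, which costs O(n^3) time; the announced algorithm attains a (1+epsilon)-approximation of this quantity in only O(nnz(A) + n*poly(k/epsilon)) time.

```python
import numpy as np

def best_rank_k_psd(A, k):
    """Exact cubic-time baseline: best Frobenius-norm PSD approximation
    of a symmetric matrix A with rank at most k, obtained by keeping the
    k largest nonnegative eigenvalues."""
    w, V = np.linalg.eigh(A)              # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]         # indices of the k largest eigenvalues
    w_top = np.clip(w[idx], 0.0, None)    # clip negatives to zero to stay PSD
    return (V[:, idx] * w_top) @ V[:, idx].T

# Illustrative usage on a random symmetric matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((100, 100))
A = (M + M.T) / 2
B = best_rank_k_psd(A, k=5)
err = np.linalg.norm(A - B, "fro")        # the quantity the fast algorithm approximates
```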