Data Science & Deep Learning Seminar: Performance of Johnson-Lindenstrauss Transform for k-Means and k-Medians Clustering

Speaker:
Konstantin Makarychev (Northwestern University)
Date:
Monday, 2.12.2019, 12:30
Place:
Room 301, Taub Building

Consider an instance of Euclidean k-means or k-medians clustering. We show that the cost of the optimal solution is preserved up to a factor of (1+ε) under a projection onto a random O(log(k/ε)/ε²)-dimensional subspace. Further, the cost of every clustering is preserved within a factor of (1+ε). Our bound on the dimension is nearly optimal. Additionally, our result applies to Euclidean k-clustering with the distances raised to the p-th power for any constant p.
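The statement above can be illustrated numerically: project points through a random Gaussian matrix and compare the k-means cost of a fixed clustering before and after projection. The sketch below is only an illustration of the Johnson-Lindenstrauss-type phenomenon, not the paper's construction or proof; the dimensions, cluster layout, and scaling choice (entries N(0, 1/m), which preserves squared norms in expectation) are assumptions made for the demo.

```python
import math
import random

random.seed(0)

def gaussian_matrix(rows, cols, scale):
    """Random matrix with i.i.d. N(0, scale^2) entries."""
    return [[random.gauss(0.0, scale) for _ in range(cols)] for _ in range(rows)]

def project(point, G):
    """Multiply a 1 x d point by the d x m matrix G."""
    d, m = len(point), len(G[0])
    return [sum(point[i] * G[i][j] for i in range(d)) for j in range(m)]

def kmeans_cost(points, labels, k):
    """Cost of a FIXED clustering: sum of squared distances to cluster centroids."""
    cost = 0.0
    for c in range(k):
        cluster = [p for p, l in zip(points, labels) if l == c]
        if not cluster:
            continue
        dim = len(cluster[0])
        centroid = [sum(p[i] for p in cluster) / len(cluster) for i in range(dim)]
        cost += sum(sum((p[i] - centroid[i]) ** 2 for i in range(dim)) for p in cluster)
    return cost

# Illustrative sizes: n points around k separated Gaussian blobs in R^d,
# projected down to R^m.
d, m, n, k = 100, 30, 200, 4
centers = [[random.gauss(10.0 * c, 1.0) for _ in range(d)] for c in range(k)]
labels = [random.randrange(k) for _ in range(n)]
points = [[random.gauss(centers[l][i], 1.0) for i in range(d)] for l in labels]

# Gaussian projection with entries N(0, 1/m): squared norms (and hence the
# k-means cost of this clustering) are preserved in expectation.
G = gaussian_matrix(d, m, 1.0 / math.sqrt(m))
projected = [project(p, G) for p in points]

ratio = kmeans_cost(projected, labels, k) / kmeans_cost(points, labels, k)
print(f"cost ratio after projection: {ratio:.3f}")
```

Because the cost sums squared distances over many roughly independent directions, the ratio concentrates near 1 even for a modest target dimension m, which is the empirical counterpart of the (1+ε) guarantee.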

Dr. Makarychev is an Associate Professor of Computer Science at Northwestern University. Before joining Northwestern, he was a researcher at Microsoft and IBM research labs. He received his undergraduate degree from the Department of Mathematics at Moscow State University and his PhD from Princeton University in 2007, where he was advised by Moses Charikar. Dr. Makarychev is interested in designing efficient algorithms for computationally hard problems. The aim of his research is to introduce new core techniques and design principles for developing and analyzing algorithms that work in theory and practice. His research interests include approximation algorithms, beyond-worst-case analysis, the theory of machine learning, and applications of high-dimensional geometry in computer science.
