
Events and Talks at the Henry and Marilyn Taub Faculty of Computer Science

Theory Seminar - On Reductions of Learning Problems and the Borsuk-Ulam theorem
Tom Waknine (Technion)
Wednesday, 07.05.2025, 13:00
Taub 201

Many practical prediction algorithms represent inputs in Euclidean space and replace the discrete 0/1 classification loss with a real-valued surrogate loss, effectively reducing classification tasks to stochastic optimization. In this talk, I will explore the expressiveness of such reductions in terms of key resources. I will formally define a general notion of reductions between learning problems and relate it to some well-known examples, such as representations by half-spaces.
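As an illustrative sketch (not the talk's construction), the following toy example shows the reduction the abstract alludes to: a half-space predictor over points in Euclidean space, with the non-convex 0/1 loss upper-bounded by the hinge loss, a standard convex surrogate. All data and parameter values here are made up for illustration.

```python
import numpy as np

# Toy half-space classifier in R^2; w is an illustrative parameter vector.
w = np.array([1.0, -1.0])
X = np.array([[2.0, 1.0], [0.0, 1.0], [1.0, 3.0], [3.0, 0.0]])
y = np.array([-1, -1, -1, 1])  # labels in {-1, +1}

# Margin of each example: positive iff the half-space classifies it correctly.
margins = y * (X @ w)

# Discrete 0/1 loss: counts mistakes, but is non-convex in w.
zero_one = np.mean(margins <= 0)

# Hinge loss: a convex surrogate that upper-bounds the 0/1 loss pointwise,
# turning classification into a stochastic convex optimization problem in w.
hinge = np.mean(np.maximum(0.0, 1.0 - margins))

print(zero_one, hinge)  # the surrogate dominates the 0/1 loss
```

Minimizing the surrogate over w is a convex problem, which is what makes the reduction from classification to stochastic optimization algorithmically useful.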
I will then establish bounds on the minimum Euclidean dimension D required to reduce a concept class with VC dimension d to a Stochastic Convex Optimization (SCO) problem in a D-dimensional Euclidean space. This result provides a formal framework for understanding the intuitive interpretation of the VC dimension as the number of parameters needed to learn a given class.
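A small sketch of the "VC dimension as parameter count" intuition mentioned above: half-spaces with a bias term in R^d are determined by d+1 parameters, and indeed shatter d+1 points. The brute-force check below (for d = 2, with the points e_1, e_2, and the origin, chosen here for illustration) uses an explicit parameter choice for each labeling; it is my own verification sketch, not material from the talk.

```python
import numpy as np
from itertools import product

d = 2
# d+1 points: the standard basis vectors and the origin.
points = np.vstack([np.eye(d), np.zeros((1, d))])

def shattered(points):
    """Check every +/-1 labeling is realized by some half-space sign(<w,x>+b)."""
    n = len(points)
    for labels in product([-1.0, 1.0], repeat=n):
        y, y0 = np.array(labels[:-1]), labels[-1]
        b = 0.5 * y0   # sign(b) matches the origin's label
        w = y - b      # at e_i: sign(w_i + b) = sign(y_i) = y_i
        preds = np.sign(points @ w + b)
        if not np.array_equal(preds, np.array(labels)):
            return False
    return True

print(shattered(points))  # all 2^(d+1) labelings are realized
```

This gives the lower bound VC(half-spaces in R^d) >= d+1, matching the number of free parameters (w, b); the talk's result makes this kind of parameter-counting intuition precise for general concept classes.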
The proof leverages a clever application of the Borsuk-Ulam theorem, illustrating an interesting connection between topology and learning theory.
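For reference, the Borsuk-Ulam theorem invoked above states (in one standard formulation; how it enters the proof is the subject of the talk):

```latex
\textbf{Theorem (Borsuk--Ulam).}
For every continuous map $f\colon S^n \to \mathbb{R}^n$ from the
$n$-dimensional sphere, there exists a point $x \in S^n$ with
\[
  f(x) = f(-x),
\]
i.e.\ some pair of antipodal points is mapped to the same value.
```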