Representation learning with iteratively reweighted kernel machines
Dmitriy Drusvyatskiy, Professor and HDSI Faculty Fellow, Halıcıoğlu Data Science Institute (HDSI), University of California San Diego
Abstract: The impressive practical performance of neural networks is often attributed to their ability to learn low-dimensional data representations and hierarchical structure directly from data. In this work, we argue that these two phenomena are not unique to neural networks, and surprisingly can be elicited from classical kernel methods. Namely, we show that the derivative of the kernel predictor can detect the influential coordinates with low sample complexity. Moreover, by iteratively using these derivatives to reweight the data and retrain kernel machines, one can efficiently learn hierarchical polynomials in the high-dimensional regime. I will illustrate the developed theory with numerical experiments on both synthetic and real data sets.
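The abstract's iterate-reweight-retrain loop can be illustrated with a minimal sketch, which is an assumption about the general idea rather than the speaker's actual algorithm: fit Gaussian-kernel ridge regression, measure each coordinate's influence by the mean squared derivative of the fitted predictor, rescale the coordinates by those influence scores, and refit. All names, hyperparameters, and the synthetic target below are illustrative choices, not taken from the talk.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma):
    """Gaussian kernel matrix K[j, i] = exp(-||X_j - Z_i||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def fit_krr(X, y, sigma, lam):
    """Kernel ridge regression: solve (K + lam I) alpha = y."""
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predictor_gradients(X, alpha, sigma):
    """Gradients of f(x) = sum_i alpha_i K(x, x_i) at each training point.

    For the Gaussian kernel, grad_x K(x, x_i) = K(x, x_i) (x_i - x) / sigma^2.
    Returns an (n, d) array of gradients.
    """
    K = gaussian_kernel(X, X, sigma)          # K[j, i] = K(x_j, x_i)
    diff = X[None, :, :] - X[:, None, :]      # diff[j, i] = x_i - x_j
    return (alpha[None, :, None] * K[:, :, None] * diff).sum(axis=1) / sigma ** 2

rng = np.random.default_rng(0)
n, d = 300, 10
X = rng.normal(size=(n, d))
y = X[:, 0] * X[:, 1]   # toy hierarchical target: depends only on coords 0 and 1

sigma, lam = 2.0, 1e-3
w = np.ones(d)          # per-coordinate weights, updated each round
for _ in range(3):
    Xw = X * w                                   # reweight the data
    alpha = fit_krr(Xw, y, sigma, lam)           # retrain the kernel machine
    G = predictor_gradients(Xw, alpha, sigma)    # derivatives of the predictor
    imp = np.sqrt((G ** 2).mean(axis=0))         # coordinate influence scores
    w = imp / imp.max()                          # normalized weights for next round

print(np.argsort(-w)[:2])  # indices of the two most influential coordinates
```

In this toy run the derivative-based scores concentrate on the two coordinates the target actually depends on, while the irrelevant coordinates are progressively downweighted, which is the mechanism the abstract describes.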
Cost: free
Audience
- Faculty/Staff
- Student
- Post Docs/Docs
- Graduate Students
Contact
Kisa Kowal
(847) 491-3974
Email
Interest
- Academic (general)