Title: Feature Learning and Gradient Flows in Compositional Kernel Models
Abstract: The classical kernel ridge regression (KRR) problem fits the output Y as a function f of the input X by minimizing a regularized loss over a fixed reproducing kernel Hilbert space (RKHS), such as a Sobolev space. We consider a natural extension in which the predictor takes the compositional form f(UX), where U is a learnable linear transformation and f lies in an RKHS. This leads to a nonlinear variational problem over the parameters f and U, and it offers a simple, analytically tractable setting for studying feature learning in compositional models: when and how such models automatically identify task-relevant features through optimization, a phenomenon widely observed in modern neural network architectures.
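For concreteness, a minimal sketch of the variational problem, assuming a squared loss and an orthonormality constraint on the linear map U (the precise loss and constraints are modeling choices not fixed by this abstract):

\[
\min_{U \in \mathbb{R}^{r \times d},\ f \in \mathcal{H}} \ \frac{1}{n} \sum_{i=1}^{n} \bigl( y_i - f(U x_i) \bigr)^2 + \lambda \, \| f \|_{\mathcal{H}}^2, \qquad \text{subject to } U U^{\top} = I_r .
\]

For any fixed U, the inner minimization over f is an ordinary KRR problem with the pulled-back kernel k_U(x, x') = k(Ux, Ux'), so the representer theorem reduces the whole problem to a finite-dimensional objective over U alone.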
In this talk, I will describe a canonical Riemannian gradient flow for finding stationary points of the compositional KRR objective which, under Gaussian noise assumptions, admits a continuous family of Lyapunov functionals, revealing a mechanism for noise suppression and dimension reduction. I will discuss connections of this result to feature learning in other compositional models, such as neural networks, and outline several open questions motivated by this perspective.
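To make the gradient-flow idea concrete, here is a hypothetical numerical sketch (not the construction from the talk): it alternates an exact KRR solve for f with a projected-gradient step for V = U^T on the Stiefel manifold, a simple explicit discretization of a Riemannian gradient flow. The Gaussian kernel, the finite-difference gradient, and all names below are illustrative assumptions.

    import numpy as np

    def gaussian_kernel(A, B, gamma=1.0):
        # Pairwise Gaussian kernel between the rows of A and B.
        sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq_dists)

    def krr_objective(X, y, V, lam):
        # Compositional KRR loss with f solved in closed form for fixed V = U^T.
        Z = X @ V                                   # projected inputs, shape (n, r)
        n = len(y)
        K = gaussian_kernel(Z, Z)
        alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
        resid = y - K @ alpha
        return resid @ resid / n + lam * alpha @ K @ alpha

    def euclidean_grad(X, y, V, lam, eps=1e-5):
        # Finite-difference gradient of the reduced objective J(V); a simple
        # stand-in for an analytic gradient.
        base = krr_objective(X, y, V, lam)
        G = np.zeros_like(V)
        for idx in np.ndindex(V.shape):
            Vp = V.copy()
            Vp[idx] += eps
            G[idx] = (krr_objective(X, y, Vp, lam) - base) / eps
        return base, G

    def stiefel_step(V, G, eta):
        # One projected-gradient step on the Stiefel manifold {V : V^T V = I}:
        # project the Euclidean gradient onto the tangent space, then retract
        # back to the manifold with a QR factorization.
        rgrad = G - V @ (V.T @ G + G.T @ V) / 2
        Q, R = np.linalg.qr(V - eta * rgrad)
        return Q * np.sign(np.diag(R))

    # Toy data with a planted two-dimensional feature subspace (illustrative only).
    rng = np.random.default_rng(0)
    d, r, n = 10, 2, 150
    V_true = np.linalg.qr(rng.standard_normal((d, r)))[0]
    X = rng.standard_normal((n, d))
    y = np.sin(X @ V_true).sum(axis=1) + 0.1 * rng.standard_normal(n)

    V = np.linalg.qr(rng.standard_normal((d, r)))[0]    # random orthonormal start
    for t in range(50):
        J, G = euclidean_grad(X, y, V, lam=1e-3)
        V = stiefel_step(V, G, eta=0.2)
    print("final objective:", J)

The tangent-space projection followed by a QR retraction keeps the learned feature map orthonormal at every step; a Riemannian gradient flow is the continuous-time limit of such steps as the step size tends to zero.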
Audience
- Faculty/Staff
- Student
- Postdocs
- Graduate Students