When:
Friday, April 21, 2023
11:00 AM - 12:00 PM CT
Where: Chambers Hall, Ruan Conference Room (lower level), 600 Foster St, Evanston, IL 60208
Audience: Faculty/Staff - Postdocs - Graduate Students
Contact:
Kisa Kowal
(847) 491-3974
Group: Department of Statistics and Data Science
Category: Academic, Lectures & Meetings
Invertible normalizing flow neural networks for statistical inference
Yao Xie, Associate Professor and Harold R. and Mary Anne Nash Early Career Professor, H. Milton Stewart School of Industrial and Systems Engineering, and Associate Director of the Machine Learning Center, Georgia Institute of Technology
Abstract: Normalizing flows are a class of deep generative models. In practice, a flow often appears as a chain of neural network blocks that estimate the score function (i.e., the gradient of the log probability density with respect to the data), which can encounter challenges in learning from and sampling high-dimensional data and may call for special techniques. For statistical inference, we are more interested in estimating the density and the density ratio than in generating samples; thus, in this talk, I will present a neural ODE flow with a transport regularization framework that achieves an efficient invertible neural network design for density and density-ratio estimation. Compared with diffusion models, the proposed method trains a neural ODE model directly, without SDE sampling or score-matching. Our approach greatly reduces memory consumption and computational cost while achieving competitive or better performance than existing flow and diffusion models. This is joint work with Chen Xu at Georgia Institute of Technology and Xiuyuan Cheng at Duke University.
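As background for the density-estimation theme of the talk, the following is a minimal sketch (not the speaker's method) of the exact density computation an invertible flow provides via the change-of-variables formula, log p_X(x) = log p_Z(f(x)) + log |det J_f(x)|. The map f here is a hypothetical invertible affine transform in 2-D, chosen only so the Jacobian log-determinant is easy to see; in the talk's setting f would be a learned neural ODE flow.

```python
import numpy as np

# Hypothetical invertible affine flow f(x) = A x + b mapping data X to a
# standard-normal base variable Z. A and b are illustrative, not learned.
A = np.array([[2.0, 0.0],
              [0.3, 0.5]])   # invertible matrix
b = np.array([1.0, -1.0])

def log_std_normal(z):
    """Log density of a standard normal in d dimensions."""
    d = z.shape[-1]
    return -0.5 * (np.sum(z**2, axis=-1) + d * np.log(2 * np.pi))

def flow_log_density(x):
    """Exact log p_X(x) via change of variables through f(x) = A x + b."""
    z = x @ A.T + b
    # For an affine map the Jacobian is A, so log|det J_f| is constant.
    _, logdet = np.linalg.slogdet(A)
    return log_std_normal(z) + logdet

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 2))
print(flow_log_density(x))  # one exact log-density value per sample
```

The same machinery gives a density-ratio estimate between two flows fitted to two samples, as the difference of their `flow_log_density` outputs at a query point.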