When:
Wednesday, February 14, 2018
11:00 AM - 12:00 PM CT
Where: 2006 Sheridan Road, B02, Evanston, IL 60208
Audience: Faculty/Staff - Students - Post Docs/Docs - Graduate Students
Cost: Free
Contact:
Kisa Kowal
(847) 491-3974
Group: Department of Statistics and Data Science
Category: Academic
Statistical Learning for Time Dependent Data
Time: 11:00 a.m.
Speaker: Likai Chen, PhD candidate, Department of Statistics, University of Chicago
Place: Basement classroom - B02, Department of Statistics, 2006 Sheridan Road
Abstract: In statistical learning theory, researchers have primarily dealt with independent data, for which there is a vast literature. By comparison, time-dependent data, which are commonly encountered in economics, engineering, finance, geography, physics, and other fields, have been much less investigated. In this talk, we focus on concentration inequalities for suprema of empirical processes, which play a fundamental role in statistical learning theory. We derive a Gaussian approximation and an upper bound for the tail probability of the suprema under conditions on the size of the function class, the sample size, the temporal dependence, and the moment conditions of the underlying time series. Due to the dependence and heavy-tailedness, our tail probability bound differs substantially from the classical exponential bounds obtained under the independence assumption in that it involves an extra polynomially decaying term. We allow both short- and long-range dependent processes, where the long-range dependence case has not been previously explored. We show that our tail probability bound is sharp up to a multiplicative constant. These bounds serve as theoretical guarantees for statistical learning applications under dependence.
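The central object of the abstract can be written out schematically; the notation below is illustrative and not taken from the talk itself:

```latex
% Schematic sketch only: \mathcal{F}, X_i, and all constants/exponents are
% assumed notation, not the speaker's exact statement.
% Supremum of the centered empirical process over a function class \mathcal{F}:
\[
  Z_n \;=\; \sup_{f \in \mathcal{F}}
  \Bigl|\, \sum_{i=1}^{n} \bigl\{ f(X_i) - \mathbb{E}\, f(X_i) \bigr\} \Bigr| .
\]
% Under independence, classical bounds (e.g. Talagrand-type) are purely
% exponential; the abstract indicates that under temporal dependence and
% finite moment conditions the bound acquires an extra polynomially
% decaying term, schematically of the form
\[
  \mathbb{P}\bigl( Z_n \ge x \bigr)
  \;\le\; C_1 \exp\!\bigl( - c\, x^2 / \sigma_n^2 \bigr)
  \;+\; C_2\, n\, x^{-q} ,
\]
% where q reflects the moment condition on the underlying time series,
% \sigma_n^2 is a variance proxy, and C_1, C_2, c are constants; all are
% placeholders for the quantities made precise in the talk.
```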