When:
Wednesday, February 26, 2025
12:00 PM - 1:00 PM CT
Where: Mudd Hall (formerly Seeley G. Mudd Library), Room 3514, 2233 Tech Drive, Evanston, IL 60208
Audience: Faculty/Staff - Students - Post Docs - Graduate Students
Cost: Free
Contact:
Wynante R Charles
(847) 467-8174
Group: Department of Computer Science (CS)
Category: Academic, Lectures & Meetings
Wednesday / CS Seminar
February 26th / 12:00 PM
Hybrid / Mudd 3514
Speaker
Tianyu Gao, Princeton University
Talk Title
Enabling Language Models to Process Information at Scale
Abstract
Language models (LMs) are highly effective at understanding and generating text, holding immense potential as intuitive, personalized interfaces for accessing information. Expanding their ability to gather and synthesize large volumes of information will further unlock transformative applications, ranging from generative search engines to AI literature assistants. In this talk, I will present my research on advancing LMs for information processing at scale. (1) I will present my evaluation framework for LM-based information-seeking systems, emphasizing the importance of providing citations for verifying the model-generated answers. Our evaluation highlights shortcomings in LMs’ abilities to reliably process long-form texts (e.g., dozens of webpages), which I address by developing state-of-the-art long-context LMs that outperform leading industry efforts while using a small fraction of the computational budget. (2) I will then introduce my foundational work on using contrastive learning to produce performant text embeddings, which form the cornerstone of effective and scalable search. (3) In addition to building systems that can process large-scale information, I will discuss my contributions to creating efficient pre-training and adaptation methods for LMs, which enable scalable deployment of LM-powered applications across diverse settings. Finally, I will share my vision for the next generation of autonomous information processing systems and outline the foundational challenges that must be addressed to realize this vision.
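The abstract's second theme, contrastive learning for text embeddings, trains an encoder so that a query and its matching passage score high similarity while other passages in the batch act as negatives. As an illustration only (not the speaker's implementation), here is a minimal in-batch InfoNCE-style loss in plain Python; the vectors, temperature value, and function name are assumptions for the sketch:

```python
import math

def info_nce_loss(queries, positives, temperature=0.05):
    """In-batch contrastive (InfoNCE) loss over plain-Python vectors.

    Row i of `positives` is the positive for query i; every other row
    in the batch serves as an in-batch negative.
    """
    def cos(a, b):
        # Cosine similarity between two vectors.
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)

    total = 0.0
    for i, q in enumerate(queries):
        # Temperature-scaled similarities to every candidate in the batch.
        logits = [cos(q, p) / temperature for p in positives]
        log_denom = math.log(sum(math.exp(z) for z in logits))
        # Cross-entropy with label i (the matching positive).
        total += -(logits[i] - log_denom)
    return total / len(queries)

# Aligned pairs (positive close to its query) should score a much
# lower loss than mismatched pairs.
queries = [[1.0, 0.0], [0.0, 1.0]]
aligned = [[0.9, 0.1], [0.1, 0.9]]
shuffled = [aligned[1], aligned[0]]
print(info_nce_loss(queries, aligned) < info_nce_loss(queries, shuffled))  # True
```

A trained embedding model would replace the toy vectors with encoder outputs; the same loss then pushes each query toward its positive and away from the rest of the batch, which is what makes the resulting embeddings useful for scalable search.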
Biography
Tianyu Gao is a fifth-year PhD student in the Department of Computer Science at Princeton University, advised by Danqi Chen. His research focuses on developing principled methods for training and adapting language models, many of which have been widely adopted across academia and industry. Driven by transformative applications, such as using language models as information-seeking tools, his work also advances robust evaluation and fosters a deeper understanding to guide the future development of language models. He led the first workshop on long-context foundation models at ICML 2024. He won an outstanding paper award at ACL 2022 and received an IBM PhD Fellowship in 2023. Before Princeton, he received his BEng from Tsinghua University in 2020.
Research/Interest Areas
Natural language processing, language models
---
Zoom: https://northwestern.zoom.us/j/91611540710?pwd=7yBeDMdu6jAcoK2wFQj9Pal31bxk6K.1
Panopto: https://northwestern.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=c651c906-f33f-48e5-87f1-b2890117aac0
Community Connections Topic: Supporting First Generation Students