Computer Science Thesis Proposal

— 11:30am

Location:
In Person - Traffic21 Classroom, Gates Hillman 6501

Speaker:
JUSTIN WHITEHOUSE, Ph.D. Student, Computer Science Department, Carnegie Mellon University
https://jwhitehouse11.github.io/

Modern Martingale Methods for Sequential Estimation

Martingale concentration is at the heart of sequential statistical analysis. Due to their time-uniform concentration of measure properties, martingales allow researchers to perform statistical inference as data is adaptively collected over time. Many state-of-the-art results in areas such as differential privacy, multi-armed bandit optimization, and online learning boil down to (a) finding an appropriate, problem-dependent martingale and (b) carefully bounding its growth.
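As a concrete illustration (not from the abstract itself), the canonical source of such time-uniform guarantees is Ville's inequality: a nonnegative supermartingale (M_t) with M_0 = 1 satisfies P(exists t: M_t >= 1/alpha) <= alpha, uniformly over the whole time horizon. The quick simulation below sketches this for an exponential supermartingale built from Rademacher increments; the step size lam, horizon, and path count are arbitrary choices for the demo.

```python
import math
import random

def ville_crossing_fraction(n_paths=2000, n_steps=500, alpha=0.05, seed=0):
    """Empirically check Ville's inequality.

    We track M_t = exp(lam * S_t - lam**2 * t / 2), where S_t is a
    +/-1 (Rademacher) random walk. Since E[exp(lam * X)] = cosh(lam)
    <= exp(lam**2 / 2), M_t is a nonnegative supermartingale with
    M_0 = 1, so the fraction of paths on which M_t ever reaches
    1/alpha should be at most alpha (up to simulation noise).
    """
    rng = random.Random(seed)
    lam = 0.5                      # arbitrary tuning parameter
    threshold = 1.0 / alpha
    crossings = 0
    for _ in range(n_paths):
        s = 0.0
        for t in range(1, n_steps + 1):
            s += 1.0 if rng.random() < 0.5 else -1.0
            m = math.exp(lam * s - lam * lam * t / 2.0)
            if m >= threshold:     # time-uniform event: crossed at SOME t
                crossings += 1
                break
    return crossings / n_paths

if __name__ == "__main__":
    print(ville_crossing_fraction())
```

Because the crossing event is "at some time t," this is exactly the kind of anytime-valid guarantee that lets inference proceed while data are collected adaptively.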

Despite the important roles martingales and time-uniform martingale concentration play in modern statistical tasks, applications of martingale concentration are typically ad hoc. Often, poorly chosen martingale concentration inequalities are applied, yielding suboptimal, or even vacuous, rates in sequential estimation problems. In this thesis proposal, we make a two-fold contribution to the study of martingales.

First, we show how carefully applied martingale concentration inequalities yield meaningful improvements in rates and constants in practically relevant data science tasks. Second, we construct general families of time-uniform concentration inequalities for self-normalized martingales. In particular, our results hold under very general noise conditions in both univariate and high-dimensional settings. In sum, this proposal provides a framework for better understanding the important role of martingales in modern estimation tasks.
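For orientation, one classical example of the kind of self-normalized, time-uniform bound at issue is the scalar method-of-mixtures ("mixture sub-Gaussian") inequality; the notation below is ours, not the proposal's, and the stated form is one standard instance rather than a result from this thesis.

```latex
% S_t: a martingale with conditionally R-sub-Gaussian increments,
% V_t: its accumulated variance process, rho > 0: a mixture parameter.
% With probability at least 1 - alpha, simultaneously for all t:
\[
  \mathbb{P}\left( \exists\, t \ge 1 :\;
    |S_t| \;\ge\; R \sqrt{\,2\,(V_t + \rho)\,
      \log\!\left( \frac{1}{\alpha}\sqrt{\frac{V_t + \rho}{\rho}} \right)}
  \right) \;\le\; \alpha .
\]
```

The bound scales with the realized variance V_t rather than a worst-case horizon, which is what makes such inequalities useful for adaptively collected data in bandit and online-learning settings.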

Thesis Committee:

Steven Zhiwei Wu (Co-Chair)
Aaditya Ramdas (Co-Chair)
Weina Wang
Aarti Singh
Csaba Szepesvári (University of Alberta)
Emilie Kaufmann (University of Lille)
