ORIE Colloquium: Angela Zhou / Lijun Ding (ORIE PhD Students)

Description

Robust Personalization from Observational Data (Angela Zhou)

Learning to make decisions from datasets in realistic environments is subject to practical challenges such as unobserved confounders, missingness, and bias, which may undermine the otherwise beneficial impacts of data-driven decisions. In this talk, I introduce a methodological framework for learning causal-effect-maximizing personalized decision policies in the presence of unobserved confounders. Much recent work assumes unconfoundedness: that there are no unobserved confounders affecting both treatment and outcome, which is often untrue for widely available observational data. I develop a methodological framework that accounts for possible unobserved confounding by minimizing the worst-case estimated regret over an ambiguity set for propensity weights. I prove generalization guarantees and present a semi-synthetic case study on personalizing hormone replacement therapy, based on the parallel WHI observational study and clinical trial. Hidden confounding can lead to unwarranted harm, while the novel robust approach guarantees safety and focuses on well-evidenced improvement. In the second part of this talk, I highlight follow-up work on leveraging these ideas to develop robust bounds for off-policy evaluation in batch (offline) reinforcement learning in the infinite-horizon setting.

Bio: Angela Zhou is a fifth-year Ph.D. student at Cornell in Operations Research and Information Engineering, currently located at Cornell Tech (NYC), where she works with Nathan Kallus. Her work was previously supported by an NDSEG fellowship. Before Cornell, Zhou studied operations research and financial engineering at Princeton University, with minors in statistics and machine learning/computer science. Angela is interested in developing prescriptive analytics with theoretical guarantees for data-driven decision-making.
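To give a flavor of the ambiguity-set idea in the abstract above, here is a minimal, purely illustrative sketch (not the talk's actual estimator): under a marginal-sensitivity-style assumption, each nominal inverse-propensity weight may deviate by a factor of at most gamma, and the worst case of the self-normalized weighted outcome over that box is attained at a threshold in the sorted outcomes, so it can be found by a simple scan.

```python
# Hypothetical sketch: worst-case (lower) bound on a self-normalized
# weighted mean sum(w*y)/sum(w) when each weight w_i may lie anywhere
# in [w_i/gamma, w_i*gamma]. The minimum of this linear-fractional
# objective over the box is attained by giving maximal weight to the
# k smallest outcomes for some threshold k, so we scan all thresholds.

def worst_case_value(outcomes, weights, gamma):
    """Lower bound on the self-normalized weighted mean under
    multiplicative weight perturbations bounded by gamma >= 1."""
    pairs = sorted(zip(outcomes, weights))  # ascending by outcome
    n = len(pairs)
    best = float("inf")
    for k in range(n + 1):
        num = den = 0.0
        for i, (y, w) in enumerate(pairs):
            # maximal weight on the k smallest outcomes, minimal on the rest
            wi = w * gamma if i < k else w / gamma
            num += wi * y
            den += wi
        best = min(best, num / den)
    return best

y = [1.0, 3.0, 2.0, 5.0]
w = [1.0, 1.0, 1.0, 1.0]
print(worst_case_value(y, w, gamma=1.0))  # 2.75: no ambiguity, plain mean
print(worst_case_value(y, w, gamma=2.0))  # 2.0: a more pessimistic bound
```

With gamma = 1 the ambiguity set collapses to the nominal weights and the bound equals the usual estimate; as gamma grows, the bound becomes more conservative, which is the safety-over-optimism trade-off the abstract describes.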
Currently, she is working on leveraging causal inference and machine learning as a language for prescriptive analytics, making robust recommendations for action in view of fundamental practical challenges in observational/operational data. Her work emphasizes credibility as a form of reliability, developing robust inferential procedures subject to analyst-tunable violations of assumptions. More broadly, Zhou is interested in the interplay of statistics and optimization for decision-making, with applications to e-commerce, healthcare, and policy.

Low-Rank Matrix Optimization (Lijun Ding)

This talk consists of two parts: (1) semidefinite programming with a low-rank solution; (2) statistical low-rank matrix recovery. In the first part, I will present a storage-optimal and time-efficient algorithm, called CSSDP (complementary slackness SDP), for solving weakly constrained semidefinite programs with low-rank solutions. I will present the algorithm, the use of complementary slackness in designing it, and a comparison of complexities with past solvers. In the second part, I will present an algorithm, called AVPG (averaging projected gradient), for solving statistical rank-constrained problems. I will present its main application to a generalized linear model with rank constraints, its advantages over existing algorithms, and an idea of the proof of its global linear convergence.

Bio: Lijun Ding is a fifth-year Ph.D. student in the School of Operations Research and Information Engineering at Cornell University, advised by Prof. Yudong Chen and Prof. Madeleine Udell. Prior to joining Cornell in the fall of 2016, he graduated with a Master of Science in statistics from the University of Chicago, advised by Prof. Lek-Heng Lim. Ding received a Bachelor of Science in mathematics and economics from the Hong Kong University of Science and Technology in 2014. He worked as a research intern at the Alibaba DAMO Academy during the summer of 2019.
Ding's research lies at the intersection of optimization, statistics, and machine learning, where he works on solving large-scale and high-dimensional optimization problems. By exploring ideas and techniques such as Frank-Wolfe methods, strict complementarity, and the leave-one-out argument across these fields, he has designed computationally and statistically efficient algorithms both for classical convex optimization problems, such as semidefinite programming, and for newly arising nonconvex problems. Ding's work on an optimal-storage approach to SDP using approximate complementarity recently won the 2019 Student Paper Prize of the INFORMS Optimization Society.
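As a rough illustration of the projected-gradient idea behind rank-constrained recovery, here is a generic singular value projection sketch for matrix completion (this is a standard textbook scheme, not Ding's AVPG algorithm): take a gradient step on the observed-entry residual, then project back onto the rank-r set via a truncated SVD.

```python
import numpy as np

# Generic singular value projection (SVP) sketch for rank-constrained
# matrix completion: minimize 0.5*||P_Omega(X - M)||_F^2 s.t. rank(X) <= r.
# The projection onto the rank-r set is a truncated SVD.

def svp_complete(M_obs, mask, r, step=1.0, iters=500):
    """Recover a rank-r matrix from observed entries.
    mask is 1 where M_obs is observed, 0 elsewhere."""
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        G = mask * (X - M_obs)                 # gradient on observed entries
        U, s, Vt = np.linalg.svd(X - step * G, full_matrices=False)
        X = (U[:, :r] * s[:r]) @ Vt[:r, :]     # project onto rank-r matrices
    return X

# rank-1 ground truth with two entries hidden
u = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
v = np.array([1.0, 1.0, 2.0, 2.0, 3.0])
M = np.outer(u, v)
mask = np.ones_like(M)
mask[0, 0] = mask[3, 4] = 0.0
X = svp_complete(mask * M, mask, r=1)
print(np.max(np.abs(X - M)))  # near zero: the hidden entries are recovered
```

With nearly all entries observed, the iteration contracts quickly toward the unique rank-1 completion; the statistical questions in the talk concern when and how fast such schemes converge under far sparser observation models.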