Title: Information Matrices and Group Structures in Sufficient Dimension Reduction
In most contemporary scientific fields, regression and classification problems involve a large number of potential predictor variables. With a literature dating back to the early '90s, Sufficient Dimension Reduction (SDR) provides a nimble and effective toolkit for handling these high-dimensional supervised problems. Specifically, SDR techniques identify a small number of composite features, most commonly linear combinations, that capture the core of the relationship between the response and the original predictors. In this talk we will highlight two important trajectories along which SDR has evolved in recent years, namely: (i) methodology based on non-parametric estimation of conditional mean or density functions, which eliminates restrictive assumptions typical of earlier reduction techniques; and (ii) approaches that leverage known structures (e.g., groups) to create more meaningful reduced representations of the data. As instances, we will focus on two novel tools: the Covariate Information Matrix (CIM) and structured Ordinary Least Squares (sOLS). Work on CIM is joint with Debmalya Nandy (PSU), Weixin Yao (UC Riverside), and the late Bruce Lindsay (PSU). Work on sOLS is joint with Yang Liu (Bank of America) and Bing Li (PSU).
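To illustrate the general SDR idea of recovering a low-dimensional linear reduction, a minimal sketch of one classic method, sliced inverse regression (SIR), follows. This is not the CIM or sOLS methodology discussed in the talk, just a standard baseline; the data-generating model, slice count, and variable names are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 2000, 6
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[0] = 1.0  # assumed true one-dimensional reduction direction
# Response depends on X only through the single index beta' X
y = (X @ beta) ** 3 + 0.5 * rng.normal(size=n)

# Standardize predictors: Z = (X - mean) Sigma^{-1/2}
Xc = X - X.mean(axis=0)
Sigma = np.cov(Xc, rowvar=False)
evals, evecs = np.linalg.eigh(Sigma)
Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
Z = Xc @ Sigma_inv_sqrt

# Slice the response into H groups and average Z within each slice
H = 10
order = np.argsort(y)
slices = np.array_split(order, H)
M = np.zeros((p, p))
for idx in slices:
    m = Z[idx].mean(axis=0)
    M += (len(idx) / n) * np.outer(m, m)  # weighted covariance of slice means

# Leading eigenvector of M, mapped back to the original X scale
w = np.linalg.eigh(M)[1][:, -1]
b_hat = Sigma_inv_sqrt @ w
b_hat /= np.linalg.norm(b_hat)
print(abs(b_hat @ beta))  # close to 1 when the direction is recovered
```

The estimated direction `b_hat` spans (approximately) the same one-dimensional subspace as the true `beta`, which is the sense in which SDR captures the core of the response–predictor relationship with a single composite feature.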