Advances in Forward Sufficient Dimension Reduction Methods for Statistical Learning
Author:
Quach, Harris
Graduate Program:
Statistics
Degree:
Doctor of Philosophy
Document Type:
Dissertation
Date of Defense:
June 15, 2022
Committee Members:
Ephraim Mont Hanks, Professor in Charge/Director of Graduate Studies
Bharath Kumar Sriperumbudur, Major Field Member
Yanyuan Ma, Major Field Member
Bing Li, Chair & Dissertation Advisor
Alexei Novikov, Outside Unit & Field Member
Keywords:
ordinal data, functional data, classification, outer product of canonical gradients, tensor product of gradients, local linear function-on-function regression, k-means tuning
Abstract:
Modern information collection methods continue to generate an abundance of data from
which practitioners, in fields ranging from the social sciences and humanities to the
natural and biomedical sciences, attempt to draw new insights and discoveries through
statistical analyses. Against this backdrop, dimension reduction methods have come
to prominence as tools for extracting important or informative features from
high-dimensional data sets. In particular, Sufficient Dimension Reduction (SDR)
methods have become a popular and effective tool for supervised dimension reduction.
Since the seminal paper by K.-C. Li (1991), SDR has developed at a rapid pace, with
inverse regression methods in particular undergoing substantial refinement and
becoming widely applicable. Forward regression approaches, by contrast, have received
relatively little attention, despite being less restrictive and more effective in
many scenarios. The objective of this thesis is to advance the forward regression
approach to sufficient dimension reduction by developing methods that are more
effective for categorical and ordinal responses, as well as methods that apply to
functional data.