Optimizing Finite Sample Performance of Differential Privacy
Open Access
- Author:
- Awan, Jordan
- Graduate Program:
- Statistics
- Degree:
- Doctor of Philosophy
- Document Type:
- Dissertation
- Date of Defense:
- February 25, 2020
- Committee Members:
- Aleksandra B Slavkovic, Dissertation Advisor/Co-Advisor
Aleksandra B Slavkovic, Committee Chair/Co-Chair
Matthew Logan Reimherr, Committee Member
Bharath Kumar Sriperumbudur, Committee Member
Daniel Kifer, Outside Member
Matthew Logan Reimherr, Dissertation Advisor/Co-Advisor
Ephraim Mont Hanks, Program Head/Chair
- Keywords:
- Statistical Disclosure Control
Statistical Disclosure Limitation
Empirical Risk Minimization
Regression
- Abstract:
- Differential Privacy (DP) is a rigorous framework that quantifies the disclosure risk of statistical procedures computed on sensitive data. DP mechanisms limit this risk by introducing additional randomness beyond sampling. However, their implementations often introduce excessive noise, reducing the utility and validity of statistical results, especially in small samples. In this dissertation, we develop differentially private methods for problems of estimation and inference, with the goal of optimizing finite-sample performance. In particular, we (1) develop the first uniformly most powerful tests within the framework of DP, along with optimal confidence intervals and confidence distributions; (2) optimize the performance of the K-norm mechanisms, with applications to private linear and logistic regression; (3) extend the exponential mechanism to function spaces and analyze its asymptotic properties; and (4) propose the K-Norm Gradient (KNG) mechanism as an alternative to the exponential mechanism, with improved asymptotic performance.
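To illustrate the core idea the abstract describes, namely adding calibrated randomness to a statistic before release, here is a minimal sketch of the classical Laplace mechanism in Python. This is a textbook baseline for context only, not one of the dissertation's proposed mechanisms (the K-norm, exponential, and KNG mechanisms it studies are refinements of this idea); the function name and parameters are illustrative.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    """Release true_value under epsilon-DP by adding Laplace noise.

    The noise scale is sensitivity / epsilon, where sensitivity bounds
    how much true_value can change when one individual's record changes.
    Smaller epsilon (stronger privacy) means more noise; this is the
    finite-sample utility cost the dissertation aims to minimize.
    """
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

# Example: privately release the mean of 100 values clipped to [0, 1].
# Changing one record moves the mean by at most 1/n, so sensitivity = 1/n.
rng = np.random.default_rng(0)
data = np.clip(rng.normal(0.6, 0.1, size=100), 0.0, 1.0)
private_mean = laplace_mechanism(data.mean(), sensitivity=1 / len(data),
                                 epsilon=1.0, rng=rng)
```

With n = 100 and epsilon = 1, the noise scale is only 0.01, so the released mean is close to the true one; for small n the scale 1/(n·epsilon) grows, which is exactly the small-sample regime the dissertation targets.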