The Data Campaign: Interrogating Online Consent Through a Unified Deception Framework

Open Access
- Author:
- Howard, Riley
- Graduate Program:
- Informatics
- Degree:
- Master of Science
- Document Type:
- Master's Thesis
- Date of Defense:
- March 18, 2025
- Committee Members:
- Sarah J Stager, Thesis Advisor/Co-Advisor
- Aiping Xiong, Committee Member
- Carleen Maitland, Program Head/Chair
- Luke Zhang, Committee Member
- William Parquette, Special Signatory
- Keywords:
- Privacy
- Deception
- Dark Patterns
- Systems-Noise Model
- Abstract:
- Deception is a fundamental aspect of human interactions, extending beyond military and intelligence applications into digital environments where companies manipulate user perceptions and decision-making. This thesis introduces the Systems-Noise Model, a novel framework that applies military and intelligence deception theories, cognitive science, and privacy research to analyze the systematic use of deception tactics by companies that benefit from invasive data practices at the expense of user privacy (privacy deception). The model conceptualizes deception as a continuous process involving signals (information that may be accurate, misleading, or deceptive), noise (factors that distort or obscure signals), and feedback loops (adaptive deception strategies), offering users, researchers, and regulators a new way to analyze privacy deception systematically. The proposed Systems-Noise Model is then demonstrated through a detailed case study of Facebook, highlighting how the platform employs denial, deceit, and misdirection to obscure its data practices, mislead users and regulators, and nudge user behavior in ways that undermine user autonomy. By mapping Facebook's documented deceptive practices onto the Systems-Noise Model, the case study demonstrates that privacy deception is not incidental but rather a carefully crafted strategy to maximize profit while minimizing user resistance. The case study further reinforces the argument that deceptive consent mechanisms are an industry-wide issue, requiring stronger intervention strategies and penalties that serve as genuine deterrents. Finally, the thesis outlines counterdeception strategies that users and regulators can employ to identify and deter privacy deception and to strengthen user autonomy in online environments.