Informational Friction as a Lens for Studying Algorithmic Aspects of Privacy

Patrick Skeba and Eric P. S. Baumer. (2020). Informational Friction as a Lens for Studying Algorithmic Aspects of Privacy. Proceedings of the ACM on Human-Computer Interaction 4, CSCW.

Abstract

This paper addresses challenges in conceptualizing privacy posed by algorithmic systems that can infer sensitive information from seemingly innocuous data. This type of privacy is of imminent concern due to the rapid adoption of machine learning and artificial intelligence systems in virtually every industry. In this paper, we suggest informational friction, a concept from Floridi’s ethics of information, as a valuable conceptual lens for studying algorithmic aspects of privacy. Informational friction describes the amount of work required for one agent to access or alter the information of another. By focusing on the amount of work, rather than the type of information or the manner in which it is collected, informational friction can help to explain why automated analyses should raise privacy concerns independently of, and in addition to, those associated with data collection. As a demonstration, this paper analyzes law enforcement use of facial recognition and Facebook’s targeted advertising model using informational friction, and demonstrates risks inherent to these systems that are not completely identified by another popular framework, Nissenbaum’s Contextual Integrity. The paper concludes with a discussion of broader implications, both for privacy research and for privacy regulation.
