Paper by Zhuang Liu, Michael Sockin & Wei Xiong: “This paper develops a foundation for a consumer’s preference for data privacy by linking it to the desire to hide behavioral vulnerabilities. Data sharing with digital platforms enhances the matching efficiency for standard consumption goods, but also exposes individuals with self-control issues to temptation goods. This creates a new form of inequality in the digital era—algorithmic inequality. Although data privacy regulations provide consumers with the option to opt out of data sharing, these regulations cannot fully protect vulnerable consumers because of data-sharing externalities. The coordination problem among consumers may also lead to multiple equilibria with drastically different levels of data sharing by consumers. Our quantitative analysis further illustrates that although data is non-rival and beneficial to social welfare, it can also exacerbate algorithmic inequality…(More)”.