New Privacy Research Has Implications for Design and Policy

At PrivacyTech: “Try visualizing the Internet’s basic architecture. Could you draw it? What would be your mental model for it?

Let’s be more specific: Say you just purchased shoes off a website using your mobile phone at work. How would you visualize that digital process? Would a deeper knowledge of this architecture make more apparent the myriad potential privacy risks in this transaction? Or, to put it another way, what would your knowledge, or lack thereof, of these architectural underpinnings reveal about your understanding of privacy and security risks?

Whether you’re a Luddite or a tech whiz, creating these mental models of the Internet is not the easiest endeavor. Just try doing so yourself.

It is an exercise, however, that several individuals undertook for new research with instructive implications for privacy and security pros.

“So everything I do on the Internet or that other people do on the Internet is basically asking the Internet for information, and the Internet is sending us to various places where the information is and then bringing us back.” – CO1

You’d think those with a better understanding of how the Internet works would also have a better understanding of the privacy and security risks, right? Most likely. Paradoxically, though, a better technological understanding may have very little influence on how an individual responds to potential privacy risks.

This is what a dedicated team of researchers from Carnegie Mellon University set out to discover in their award-winning paper, “‘My Data Just Goes Everywhere’: User Mental Models of the Internet and Implications for Privacy and Security,” a culmination of research by Ruogu Kang, Laura Dabbish, Nathaniel Fruchter and Sara Kiesler, all of CMU’s Human-Computer Interaction Institute and the Heinz College in Pittsburgh, PA.

“I try to browse through the terms and conditions but there’s so much there I really don’t retain it.” – T11

Presented at the CyLab Usable Privacy and Security Laboratory’s (CUPS) 11th Symposium on Usable Privacy and Security (SOUPS), their research demonstrated that even though savvy and non-savvy Internet users have very different perceptions of its architecture, such knowledge was not predictive of whether a user would take the necessary steps to protect their privacy online. Experience, rather, appears to play a more determinative role.

Kang, who led the team, said she was surprised by the results… (More)”