Privacy and Interoperability Challenges Could Limit the Benefits of Education Technology


Report by Katharina Ley Best and John F. Pane: “The expansion of education technology is transforming the learning environment in classrooms, schools, school systems, online, and at home. The rise of education technology brings with it an increased opportunity for the collection and application of data, which are valuable resources for educators, schools, policymakers, researchers, and software developers.

RAND researchers examine some of the possible implications of growing data collection and availability related to education technology. Specifically, this Perspective discusses potential data infrastructure challenges that could limit data usefulness, considers data privacy implications in an education technology context, and reviews privacy principles that could help educators and policymakers evaluate the changing education data privacy landscape in anticipation of potential future changes to regulations and best practices….(More)”.

Open Data, Grey Data, and Stewardship: Universities at the Privacy Frontier.


Paper by Christine L. Borgman: “As universities recognize the inherent value in the data they collect and hold, they encounter unforeseen challenges in stewarding those data in ways that balance accountability, transparency, and protection of privacy, academic freedom, and intellectual property. Two parallel developments in academic data collection are converging: (1) open access requirements, whereby researchers must provide access to their data as a condition of obtaining grant funding or publishing results in journals; and (2) the vast accumulation of “grey data” about individuals in their daily activities of research, teaching, learning, services, and administration.

The boundaries between research and grey data are blurring, making it more difficult to assess the risks and responsibilities associated with any data collection. Many sets of data, both research and grey, fall outside privacy regulations such as HIPAA, FERPA, and PII. Universities are exploiting these data for research, learning analytics, faculty evaluation, strategic decisions, and other sensitive matters. Commercial entities are besieging universities with requests for access to data or for partnerships to mine them. The privacy frontier facing research universities spans open access practices, uses and misuses of data, public records requests, cyber risk, and curating data for privacy protection. This Article explores the competing values inherent in data stewardship and makes recommendations for practice by drawing on the pioneering work of the University of California in privacy and information security, data governance, and cyber risk….(More)”.

Revisiting the governance of privacy: Contemporary policy instruments in global perspective


Colin J. Bennett and Charles D. Raab at Regulation & Governance: “The repertoire of policy instruments within a particular policy sector varies by jurisdiction; some “tools of government” are associated with particular administrative and regulatory traditions and political cultures. It is less clear how the instruments associated with a particular policy sector may change over time, as economic, social, and technological conditions evolve.

In the early 2000s, we surveyed and analyzed the global repertoire of policy instruments deployed to protect personal data. In this article, we explore how those instruments have changed as a result of 15 years of social, economic and technological transformations, during which the issue has assumed a far higher global profile, as one of the central policy questions associated with modern networked communications.

We review the contemporary range of transnational, regulatory, self‐regulatory, and technical instruments according to the same framework, and conclude that the types of policy instrument have remained relatively stable, even though they are now deployed on a global scale.

While the labels remain the same, however, the conceptual foundations for their legitimation and justification are shifting as greater emphases on accountability, risk, ethics, and the social/political value of privacy have gained purchase. Our analysis demonstrates both continuity and change within the governance of privacy, and displays how we would have tackled the same research project today.

As a broader case study of regulation, it highlights the importance of going beyond technical and instrumental labels. Change or stability of policy instruments does not take place in isolation from the wider conceptualizations that shape their meaning, purpose, and effect…(More)”.

Making Wage Data Work: Creating a Federal Resource for Evidence and Transparency


Christina Pena at the National Skills Coalition: “Administrative data on employment and earnings, commonly referred to as wage data or wage records, can be used to assess the labor market outcomes of workforce, education, and other programs, providing policymakers, administrators, researchers, and the public with valuable information. However, there is no single readily accessible federal source of wage data that covers all workers. Noting the importance of employment and earnings data to decision makers, the Commission on Evidence-Based Policymaking called for the creation of a single federal source of wage data for statistical purposes and evaluation. The Commission recommended three options for further exploration: expanding access to systems that already exist at the U.S. Census Bureau or the U.S. Department of Health and Human Services (HHS), or creating a new database at the U.S. Department of Labor (DOL).

This paper reviews current coverage and allowable uses, as well as federal and state actions required to make each option viable as a single federal source of wage data that can be accessed by government agencies and authorized researchers. Congress and the President, in conjunction with relevant federal and state agencies, should develop one or more of those options to improve wage information for multiple purposes. Although not assessed in the following review, financial as well as privacy and security considerations would influence the viability of each scenario. Moreover, if a system like the Commission-recommended National Secure Data Service for sharing data between agencies comes to fruition, then a wage system might require additional changes to work with the new service….(More)”

Uninformed Consent


Leslie K. John at Harvard Business Review: “…People are bad at making decisions about their private data. They misunderstand both costs and benefits. Moreover, natural human biases interfere with their judgment. And whether by design or accident, major platform companies and data aggregators have structured their products and services to exploit those biases, often in subtle ways.

Impatience. People tend to overvalue immediate costs and benefits and underweight those that will occur in the future. They want $9 today rather than $10 tomorrow. On the internet, this tendency manifests itself in a willingness to reveal personal information for trivial rewards. Free quizzes and surveys are prime examples. …
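
To make that arithmetic concrete, one way to read the $9-versus-$10 preference is as an implied daily discount factor; the exponential framing below is our illustration, not a claim from the article:

```latex
% Preferring $9 now to $10 tomorrow implies a one-day discount factor of at most 9/10.
% Applied consistently over a year, that factor shrinks to essentially nothing:
\[
\delta \le \tfrac{9}{10}, \qquad \delta^{365} \le 0.9^{365} \approx 2 \times 10^{-17}
\]
% Few people treat a benefit a year away as literally worthless, which is why this
% pattern is usually described as present bias rather than consistent discounting.
```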

The endowment effect. In theory people should be willing to pay the same amount to buy a good as they’d demand when selling it. In reality, people typically value a good less when they have to buy it than when they have to give it up. A similar dynamic can be seen when people make decisions about privacy….

Illusion of control. People share a misapprehension that they can control chance processes. This explains why, for example, study subjects valued lottery tickets that they had personally selected more than tickets that had been randomly handed to them. People also confuse the superficial trappings of control with real control….

Desire for disclosure. This is not a decision-making bias. Rather, humans have what appears to be an innate desire, or even need, to share with others. After all, that’s how we forge relationships — and we’re inherently social creatures…

False sense of boundaries. In off-line contexts, people naturally understand and comply with social norms about discretion and interpersonal communication. Though we may be tempted to gossip about someone, the norm “don’t talk behind people’s backs” usually checks that urge. Most of us would never tell a trusted confidant our secrets when others are within earshot. And people’s reactions in the moment can make us quickly scale back if we disclose something inappropriate….(More)”.

Digital Deceit II: A Policy Agenda to Fight Disinformation on the Internet


We have developed here a broad policy framework to address the digital threat to democracy, building upon basic principles to recommend a set of specific proposals.

Transparency: As citizens, we have the right to know who is trying to influence our political views and how they are doing it. We must have explicit disclosure about the operation of dominant digital media platforms — including:

  • Real-time and archived information about targeted political advertising;
  • Clear accountability for the social impact of automated decision-making;
  • Explicit indicators for the presence of non-human accounts in digital media.

Privacy: As individuals with the right to personal autonomy, we must be given more control over how our data is collected, used, and monetized — especially when it comes to sensitive information that shapes political decision-making. A baseline data privacy law must include:

  • Consumer control over data through stronger rights to access and removal;
  • Transparency for users about the full extent of data usage, and meaningful consent;
  • Stronger enforcement with resources and authority for agency rule-making.

Competition: As consumers, we must have meaningful options to find, send and receive information over digital media. The rise of dominant digital platforms demonstrates how market structure influences social and political outcomes. A new competition policy agenda should include:

  • Stronger oversight of mergers and acquisitions;
  • Antitrust reform including new enforcement regimes, levies, and essential services regulation;
  • Robust data portability and interoperability between services.

There are no single-solution approaches to the problem of digital disinformation that are likely to change outcomes. … Awareness and education are the first steps toward organizing and action to build a new social contract for digital democracy….(More)”

The Three Goals and Five Functions of Data Stewards


Medium Article by Stefaan G. Verhulst: “…Yet even as we see more data steward-type roles defined within companies, there exists considerable confusion about just what they should be doing. In particular, we have noticed a tendency to conflate the roles of data stewards with those of individuals or groups who might be better described as chief privacy, chief data or security officers. This slippage is perhaps understandable, but our notion of the role is somewhat broader. While privacy and security are of course key components of trusted and effective data collaboratives, the real goal is to leverage private data for broader social goals — while preventing harm.

So what are the necessary attributes of data stewards? What are the roles, responsibilities, and goals of data stewards? And how can they be most effective, both as champions of sharing within organizations and as facilitators for leveraging data with external entities? These are some of the questions we seek to address in our current research, and below we outline some key preliminary findings.

The following “Three Goals” and “Five Functions” can help define the aspirations of data stewards, and what is needed to achieve the goals. While clearly only a start, these attributes can help guide companies currently considering setting up sharing initiatives or establishing data steward-like roles.

The Three Goals of Data Stewards

  • Collaborate: Data stewards are committed to working and collaborating with others, with the goal of unlocking the inherent value of data when a clear case exists that it serves the public good and that it can be used in a responsible manner.
  • Protect: Data stewards are committed to managing private data ethically, which means sharing information responsibly, and preventing harm to potential customers, users, corporate interests, the wider public and of course those individuals whose data may be shared.
  • Act: Data stewards are committed to acting proactively to identify partners who may be in a better position to unlock value and insights contained within privately held data.

…(More)”.

Google, T-Mobile Tackle 911 Call Problem


Sarah Krouse at the Wall Street Journal: “Emergency call operators will soon have an easier time pinpointing the whereabouts of Android phone users.

Google has struck a deal with T-Mobile US to pipe location data from cellphones with Android operating systems in the U.S. to emergency call centers, said Fiona Lee, who works on global partnerships for Android emergency location services.

The move is a sign that smartphone operating system providers and carriers are taking steps to improve the quality of location data they send when customers call 911. Locating callers has become a growing problem for 911 operators as cellphone usage has proliferated. Wireless devices now make 80% or more of the 911 calls placed in some parts of the U.S., according to the trade group National Emergency Number Association. There are roughly 240 million calls made to 911 annually.

While landlines deliver an exact address, cellphones typically register only an estimated location provided by wireless carriers that can be as wide as a few hundred yards and imprecise indoors.

That has meant that while many popular applications like Uber can pinpoint users, 911 call takers can’t always do so. Technology giants such as Google and Apple Inc. that run phone operating systems need a direct link to the technology used within emergency call centers to transmit precise location data….

Google currently offers emergency location services in 14 countries around the world by partnering with carriers and companies that are part of local emergency communications infrastructure. Its location data is based on a combination of inputs including Wi-Fi, device sensors, GPS, and mobile network information.
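
As a rough, hedged illustration of what fusing those inputs can look like, here is a generic inverse-variance weighting sketch with made-up coordinates and accuracy values; it is a common textbook approach, not Google's disclosed ELS method:

```python
# Hypothetical sketch: combine several position estimates (cell network, Wi-Fi,
# GPS), each with its own accuracy radius, into one estimate. This is NOT the
# Android ELS implementation -- just standard inverse-variance weighting to show
# why fusing several sources can beat any single one.
from dataclasses import dataclass

@dataclass
class Fix:
    source: str
    lat: float
    lon: float
    accuracy_m: float  # reported confidence radius, in meters

def fuse(fixes):
    """Weight each fix by 1/accuracy^2, treating the accuracy radius as a std dev."""
    weights = [1.0 / f.accuracy_m ** 2 for f in fixes]
    total = sum(weights)
    lat = sum(w * f.lat for w, f in zip(weights, fixes)) / total
    lon = sum(w * f.lon for w, f in zip(weights, fixes)) / total
    accuracy = (1.0 / total) ** 0.5  # combined uncertainty shrinks as sources agree
    return lat, lon, accuracy

if __name__ == "__main__":
    fixes = [
        Fix("cell", 32.7877, -79.9403, accuracy_m=400.0),
        Fix("wifi", 32.7872, -79.9398, accuracy_m=40.0),
        Fix("gps",  32.7871, -79.9399, accuracy_m=15.0),
    ]
    lat, lon, acc = fuse(fixes)
    print(f"fused: {lat:.5f}, {lon:.5f} (+/- {acc:.0f} m)")
```

The point of the weighting is simply that agreeing high-accuracy sources (Wi-Fi, GPS) dominate the coarse cell-network estimate, which is roughly why a fused estimate can be much tighter than the carrier-provided location described above.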

Jim Lake, director at the Charleston County Consolidated 9-1-1 Center, participated in a pilot of Google’s emergency location services and said it made it easier to find people who didn’t know their location, particularly because the area draws tourists.

“On a day-to-day basis, most people know where they are, but when they don’t, usually those are the most horrifying calls and we need to know right away,” Mr. Lake said.

In June, Apple said it had partnered with RapidSOS to send iPhone users’ location information to 911 call centers….(More)”

Is Mass Surveillance the Future of Conservation?


Mallory Pickett at Slate: “The high seas are probably the most lawless place left on Earth. They’re a portal back in time to the way the world looked for most of our history: fierce and open competition for resources and contested territories. Pirating continues to be a way to make a living.

It’s not a complete free-for-all—most countries require registration of fishing vessels and enforce environmental protocols. Cooperative agreements between countries oversee fisheries in international waters. But the best data available suggests that around 20 percent of the global seafood catch is illegal. This is an environmental hazard because unregistered boats evade regulations meant to protect marine life. And it’s an economic problem for fishermen who can’t compete with boats that don’t pay for licenses or follow the (often expensive) regulations. In many developing countries, local fishermen are outfished by foreign vessels coming into their territory and stealing their stock….

But Henri Weimerskirch, a French ecologist, has a cheap, low-impact way to monitor thousands of square miles a day in real time: He’s getting birds to do it (a project first reported by Hakai). Specifically, albatrosses, which have a 10-foot wingspan and can fly around the world in 46 days. The birds naturally congregate around fishing boats, hoping for an easy meal, so Weimerskirch is equipping them with GPS loggers that also have radar detection to pick up the ship’s radar (and make sure it is a ship, not an island) and a transmitter to send that data to authorities in real time. If it works, this should help in two ways: It will provide some information on the extent of the unofficial fishing operation in the area, and because the loggers will transmit their information in real time, the data will be used to notify French navy ships in the area to check out suspicious boats.
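
A minimal sketch of how such pings might be screened on shore, assuming each logger reports a position plus a radar-detection flag and that authorities hold a registry of licensed vessel positions; the field names, thresholds, and the cross-check itself are our assumptions, not details of Weimerskirch's system:

```python
# Hypothetical shore-side filter for albatross logger pings. Each ping carries
# the bird's GPS position and a flag for detected ship radar. Radar detections
# are compared against known positions of licensed vessels (e.g., from AIS) and
# anything unmatched is flagged for follow-up. All fields and thresholds are invented.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def suspicious_pings(pings, licensed_vessels, match_km=30.0):
    """Return radar detections with no licensed vessel reported nearby."""
    flagged = []
    for ping in pings:
        if not ping["radar_detected"]:
            continue
        near_known = any(
            haversine_km(ping["lat"], ping["lon"], v["lat"], v["lon"]) <= match_km
            for v in licensed_vessels
        )
        if not near_known:
            flagged.append(ping)
    return flagged

if __name__ == "__main__":
    pings = [
        {"bird_id": "alb-07", "lat": -46.43, "lon": 51.86, "radar_detected": True},
        {"bird_id": "alb-12", "lat": -48.10, "lon": 68.70, "radar_detected": True},
        {"bird_id": "alb-03", "lat": -47.00, "lon": 60.00, "radar_detected": False},
    ]
    licensed = [{"name": "FV Austral", "lat": -48.12, "lon": 68.75}]
    for hit in suspicious_pings(pings, licensed):
        print("possible dark vessel near", hit["bird_id"], hit["lat"], hit["lon"])
```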

His team is getting ready to deploy about 80 birds in the south Indian Ocean this November. The loggers attached around the birds’ legs are about the shape and size of a Snickers. The south Indian Ocean is a shared fishing zone, and nine countries, including France (courtesy of several small islands it claims ownership of, a vestige of colonialism), manage it together. But there are big problems with illegal fishing in the area, especially of the Patagonian toothfish (better known to consumers as Chilean seabass)….(More)”

Rohingya turn to blockchain to solve identity crisis


Skot Thayer and Alex Hern at the Guardian: “Rohingya refugees are turning to blockchain-type technology to help address one of their most existential threats: lack of officially-recognised identity.

Denied citizenship in their home country of Myanmar for decades, the Muslim minority was the target of a brutal campaign of violence by the military which culminated a year ago this week. A “clearance operation” led by Buddhist militia sent more than 700,000 Rohingya pouring over the border into Bangladesh, without passports or official ID.

The Myanmar government has since agreed to take the Rohingya back, but is refusing to grant them citizenship. Many Rohingya do not want to return and face life without a home or an identity. This growing crisis prompted Muhammad Noor and his team at the Rohingya Project to try to find a digital solution.

“Why does a centralised entity like a bank or government own my identity,” says Noor, a Rohingya community leader based in Kuala Lumpur. “Who are they to say if I am who I am?”

Using blockchain-based technology, Noor is trialling the use of digital identity cards that aim to help Rohingya in Malaysia, Bangladesh and Saudi Arabia access services such as banking and education. The hope is that successful trials might lead to a system that can help the community across southeast Asia.

Under the scheme, a blockchain database is used to record individual digital IDs, which can then be issued to people once they have taken a test to verify that they are genuine Rohingya….
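
The article does not describe the underlying data structure, but a toy append-only, hash-chained ledger that stores only salted hashes of personal records gives a feel for the general idea; every field name and the verification step below are assumptions, not the Rohingya Project's actual design:

```python
# Toy append-only ledger for identity attestations. Each block stores a salted
# hash of the person's record (never the raw data) plus the hash of the previous
# block, so any later tampering breaks the chain. Purely illustrative.
import hashlib
import json
import time

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

class IdentityLedger:
    def __init__(self):
        self.blocks = []  # list of dicts; blocks[0] is the genesis block
        self._append({"type": "genesis"})

    def _append(self, payload: dict) -> dict:
        prev_hash = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        block = {"index": len(self.blocks), "time": time.time(),
                 "payload": payload, "prev_hash": prev_hash}
        block["hash"] = sha256_hex(json.dumps(block, sort_keys=True).encode())
        self.blocks.append(block)
        return block

    def register(self, person_record: dict, salt: str, verified_by: str) -> dict:
        """Record that a verifier attested to this person, storing only a hash."""
        digest = sha256_hex((salt + json.dumps(person_record, sort_keys=True)).encode())
        return self._append({"type": "attestation", "id_hash": digest,
                             "verified_by": verified_by})

    def is_intact(self) -> bool:
        """Recompute hashes to confirm no block was altered after the fact."""
        for i, block in enumerate(self.blocks):
            body = {k: v for k, v in block.items() if k != "hash"}
            if block["hash"] != sha256_hex(json.dumps(body, sort_keys=True).encode()):
                return False
            if i and block["prev_hash"] != self.blocks[i - 1]["hash"]:
                return False
        return True

if __name__ == "__main__":
    ledger = IdentityLedger()
    ledger.register({"name": "example person", "born": "1994"},
                    salt="per-person-random-salt", verified_by="community-panel-01")
    print("chain intact:", ledger.is_intact())
```

Storing only a salted hash means the chain can show that an attestation existed at a given time without exposing the underlying personal data, a design point that matters for a population already at risk.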

Blockchain-based initiatives, such as the Rohingya Project, could eventually allow people to build the network of relationships necessary to participate in the modern global economy and prevent second and third generation “invisible” people from slipping into poverty. It could also allow refugees to send money across borders, bypassing high transaction fees.

In Jordan’s Azraq refugee camp, the United Nations World Food Programme (WFP) is using blockchain and biometrics to help Syrian refugees to purchase groceries using a voucher system. This use of the technology allows the WFP to bypass bank fees.

But Al Rjula says privacy is still an issue. “The technology is maturing, yet implementation by startups and emerging tech companies is still lacking,” he says.

The involvement of a trendy technology such as blockchains can often be enough to secure the funding, attention and support that start-ups – whether for-profit or charitable – need to thrive. But companies such as Tykn still have to tackle plenty of the same issues as their old-fashioned database-using counterparts, from convincing governments and NGOs to use their services in the first place to working out how to bring in enough money to cover overhead and pay staff, while also dealing with the fickle issues of building on a cutting-edge platform.

Blockchain-based humanitarian initiatives will also need to reckon with the problem of accountability in their efforts to aid refugees and those trapped in the limbo of statelessness.

Dilek Genc, a PhD candidate at the University of Edinburgh who studies blockchain-type applications in humanitarian aid and development, says that if the aid community continues to push innovation using Silicon Valley’s creed of “fail fast and often” and experiments on vulnerable peoples, it will be fundamentally at odds with humanitarian principles and will fail to address the political roots of the issues facing refugees…(More)”.