Digital Deceit II: A Policy Agenda to Fight Disinformation on the Internet


We have developed here a broad policy framework to address the digital threat to democracy, building upon basic principles to recommend a set of specific proposals.

Transparency: As citizens, we have the right to know who is trying to influence our political views and how they are doing it. We must have explicit disclosure about the operation of dominant digital media platforms — including:

  • Real-time and archived information about targeted political advertising;
  • Clear accountability for the social impact of automated decision-making;
  • Explicit indicators for the presence of non-human accounts in digital media.

Privacy: As individuals with the right to personal autonomy, we must be given more control over how our data is collected, used, and monetized — especially when it comes to sensitive information that shapes political decision-making. A baseline data privacy law must include:

  • Consumer control over data through stronger rights to access and removal;
  • Transparency for users about the full extent of data usage, along with meaningful consent;
  • Stronger enforcement with resources and authority for agency rule-making.

Competition: As consumers, we must have meaningful options to find, send and receive information over digital media. The rise of dominant digital platforms demonstrates how market structure influences social and political outcomes. A new competition policy agenda should include:

  • Stronger oversight of mergers and acquisitions;
  • Antitrust reform including new enforcement regimes, levies, and essential services regulation;
  • Robust data portability and interoperability between services.

There are no single-solution approaches to the problem of digital disinformation that are likely to change outcomes. … Awareness and education are the first steps toward organizing and action to build a new social contract for digital democracy….(More)”

The Three Goals and Five Functions of Data Stewards


Medium Article by Stefaan G. Verhulst: “…Yet even as we see more data steward-type roles defined within companies, there exists considerable confusion about just what they should be doing. In particular, we have noticed a tendency to conflate the roles of data stewards with those of individuals or groups who might be better described as chief privacy, chief data or security officers. This slippage is perhaps understandable, but our notion of the role is somewhat broader. While privacy and security are of course key components of trusted and effective data collaboratives, the real goal is to leverage private data for broader social goals — while preventing harm.

So what are the necessary attributes of data stewards? What are their roles, responsibilities, and goals? And how can they be most effective, both as champions of sharing within organizations and as facilitators for leveraging data with external entities? These are some of the questions we seek to address in our current research, and below we outline some key preliminary findings.

The following “Three Goals” and “Five Functions” can help define the aspirations of data stewards, and what is needed to achieve the goals. While clearly only a start, these attributes can help guide companies currently considering setting up sharing initiatives or establishing data steward-like roles.

The Three Goals of Data Stewards

  • Collaborate: Data stewards are committed to working and collaborating with others, with the goal of unlocking the inherent value of data when a clear case exists that it serves the public good and that it can be used in a responsible manner.
  • Protect: Data stewards are committed to managing private data ethically, which means sharing information responsibly, and preventing harm to potential customers, users, corporate interests, the wider public and of course those individuals whose data may be shared.
  • Act: Data stewards are committed to acting proactively to identify partners who may be better positioned to unlock the value and insights contained within privately held data.

…(More)”.

Rohingya turn to blockchain to solve identity crisis


Skot Thayer and Alex Hern at the Guardian: “Rohingya refugees are turning to blockchain-type technology to help address one of their most existential threats: lack of officially recognised identity.

Denied citizenship in their home country of Myanmar for decades, the Muslim minority was the target of a brutal campaign of violence by the military which culminated a year ago this week. A “clearance operation” led by Buddhist militia sent more than 700,000 Rohingya pouring over the border into Bangladesh, without passports or official ID.

The Myanmar government has since agreed to take the Rohingya back, but is refusing to grant them citizenship. Many Rohingya do not want to return and face life without a home or an identity. This growing crisis prompted Muhammad Noor and his team at the Rohingya Project to try to find a digital solution.

“Why does a centralised entity like a bank or government own my identity,” says Noor, a Rohingya community leader based in Kuala Lumpur. “Who are they to say if I am who I am?”

Using blockchain-based technology, Noor is trialling the use of digital identity cards that aim to help Rohingya in Malaysia, Bangladesh and Saudi Arabia access services such as banking and education. The hope is that successful trials might lead to a system that can help the community across southeast Asia.

Under the scheme, a blockchain database is used to record individual digital IDs, which can then be issued to people once they have taken a test to verify that they are genuine Rohingya….
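The Rohingya Project has not published its technical design, but the scheme described above — verified individuals recorded in an append-only blockchain database — can be sketched with a toy hash-chained ledger. Everything here (class and method names, the verification flag) is a hypothetical illustration, not the project’s actual system:

```python
import hashlib
import json


def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()


class IdLedger:
    """Toy append-only ledger: each entry commits to the previous one,
    so issued IDs cannot be silently rewritten after the fact."""

    def __init__(self):
        self.blocks = []

    def issue_id(self, holder_name: str, verified: bool) -> str:
        # An ID is only issued once the holder has passed verification.
        if not verified:
            raise ValueError("identity must be verified before issuance")
        prev_hash = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        record = {"holder": holder_name, "prev": prev_hash}
        block_hash = sha256(json.dumps(record, sort_keys=True).encode())
        self.blocks.append({"record": record, "hash": block_hash})
        return block_hash  # serves as the holder's digital ID

    def valid(self) -> bool:
        # Recompute every hash and check the chain of back-references.
        prev = "0" * 64
        for block in self.blocks:
            if block["record"]["prev"] != prev:
                return False
            recomputed = sha256(
                json.dumps(block["record"], sort_keys=True).encode()
            )
            if recomputed != block["hash"]:
                return False
            prev = block["hash"]
        return True
```

Because each block embeds the hash of its predecessor, altering any earlier record invalidates every hash that follows it — which is what makes such a database harder to tamper with than an ordinary centralised registry.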

Blockchain-based initiatives, such as the Rohingya Project, could eventually allow people to build the network of relationships necessary to participate in the modern global economy and prevent second and third generation “invisible” people from slipping into poverty. It could also allow refugees to send money across borders, bypassing high transaction fees.

In Jordan’s Azraq refugee camp, the United Nations World Food Programme (WFP) is using blockchain and biometrics to help Syrian refugees to purchase groceries using a voucher system. This use of the technology allows the WFP to bypass bank fees.

But Al Rjula says privacy is still an issue. “The technology is maturing, yet implementation by startups and emerging tech companies is still lacking,” he says.

The involvement of a trendy technology such as blockchains can often be enough to secure the funding, attention and support that start-ups – whether for-profit or charitable – need to thrive. But companies such as Tykn still have to tackle plenty of the same issues as their old-fashioned database-using counterparts, from convincing governments and NGOs to use their services in the first place to working out how to generate enough revenue to pay staff, while also dealing with the fickle issues of building on a cutting-edge platform.

Blockchain-based humanitarian initiatives will also need to reckon with the problem of accountability in their efforts to aid refugees and those trapped in the limbo of statelessness.

Dilek Genc, a PhD candidate at the University of Edinburgh who studies blockchain-type applications in humanitarian aid and development, says that if the aid community continues to push innovation using Silicon Valley’s creed of “fail fast and often” and to experiment on vulnerable peoples, it will be fundamentally at odds with humanitarian principles and will fail to address the political roots of the issues facing refugees…(More)”.

Resource Guide to Data Governance and Security


National Neighborhood Indicators Partnership (NNIP): “Any organization that collects, analyzes, or disseminates data should establish formal systems to manage data responsibly, protect confidentiality, and document data files and procedures. In doing so, organizations will build a reputation for integrity and facilitate appropriate interpretation and data sharing, factors that contribute to an organization’s long-term sustainability.

To help groups improve their data policies and practices, this guide assembles lessons from the experiences of partners in the National Neighborhood Indicators Partnership network and similar organizations. The guide presents advice and annotated resources for the three parts of a data governance program: protecting privacy and human subjects, ensuring data security, and managing the data life cycle. While applicable for non-sensitive data, the guide is geared for managing confidential data, such as data used in integrated data systems or Pay-for-Success programs….(More)”.

Is the Government More Entrepreneurial Than You Think?


Freakonomics Radio (Podcast): We all know the standard story: our economy would be more dynamic if only the government would get out of the way. The economist Mariana Mazzucato says we’ve got that story backward. She argues that the government, by funding so much early-stage research, is hugely responsible for big successes in tech, pharma, energy, and more. But the government also does a terrible job in claiming credit — and, more important, getting a return on its investment….

Quote:

MAZZUCATO: “…And I’ve been thinking about this especially around the big data and the kind of new questions around privacy with Facebook, etc. Instead of having a situation where all the data basically gets captured, which is citizens’ data, by companies which then, in some way, we have to pay into in terms of accessing these great new services — whether they’re free or not, we’re still indirectly paying. We should have the data in some sort of public repository because it’s citizens’ data. The technology itself was funded by the citizens. What would Uber be without GPS, publicly financed? What would Google be without the Internet, publicly financed? So, the tech was financed from the state, the citizens; it’s their data. Why not completely reverse the current relationship and have that data in a public repository which companies actually have to pay into to get access to it under certain strict conditions which could be set by an independent advisory council?… (More)”

How Smart Should a City Be? Toronto Is Finding Out


Laura Bliss at CityLab: “A data-driven “neighborhood of the future” masterminded by a Google corporate sibling, the Quayside project could be a milestone in digital-age city-building. But after a year of scandal in Silicon Valley, questions about privacy and security remain…

Quayside was billed as “the world’s first neighborhood built from the internet up,” according to Sidewalk Labs’ vision plan, which won the RFP to develop this waterfront parcel. The startup’s pitch married “digital infrastructure” with a utopian promise: to make life easier, cheaper, and happier for Torontonians.

Everything from pedestrian traffic and energy use to the fill-height of a public trash bin and the occupancy of an apartment building could be counted, geo-tagged, and put to use by a wifi-connected “digital layer” undergirding the neighborhood’s physical elements. It would sense movement, gather data, and send information back to a centralized map of the neighborhood. “With heightened ability to measure the neighborhood comes better ways to manage it,” stated the winning document. “Sidewalk expects Quayside to become the most measurable community in the world.”

That somewhat Orwellian vision of city management had privacy advocates and academics concerned from the start. Bianca Wylie, the co-founder of the technology advocacy group Tech Reset Canada, has been perhaps the most outspoken of the project’s local critics. For the last year, she’s spoken up at public fora, written pointed op-eds and Medium posts, and warned city officials of what she sees as the “Trojan horse” of smart city marketing: private companies that stride into town promising better urban governance, but are really there to sell software and monetize citizen data.

“Smart cities are largely an invention of the private sector—an effort to create a market within government,” Wylie wrote in Canada’s Globe and Mail newspaper in December 2017. “The business opportunities are clear. The risks inherent to residents, less so.” A month later, at a Toronto City Council meeting, Wylie gave a deputation asking officials to “ensure that the data and data infrastructure of this project are the property of the city of Toronto and its residents.”

In this case, the unwary Trojans would be Waterfront Toronto, the nonprofit corporation appointed by three levels of Canadian government to own, manage, and build on the Port Lands, 800 largely undeveloped acres between downtown and Lake Ontario. When Waterfront Toronto gave Sidewalk Labs a green light for Quayside in October, the startup committed $50 million to a one-year consultation, which was recently extended by several months. The plan is to submit a final “Master Innovation and Development Plan” by the end of this year.

But there has been no guarantee about who would own the data at the core of its proposal—much of which would ostensibly be gathered in public space. Also unresolved is the question of whether this data could be sold. With little transparency about what that means from the company or its partner, some Torontonians are wondering what Waterfront Toronto—and by extension, the public—is giving away….(More)”.

Decentralisation: the next big step for the world wide web


Zoë Corbyn at The Observer: “The decentralised web, or DWeb, could be a chance to take control of our data back from the big tech firms. So how does it work and when will it be here?...What is the decentralised web? 
It is supposed to be like the web you know but without relying on centralised operators. In the early days of the world wide web, which came into existence in 1989, you connected directly with your friends through desktop computers that talked to each other. But from the early 2000s, with the advent of Web 2.0, we began to communicate with each other and share information through centralised services provided by big companies such as Google, Facebook, Microsoft and Amazon. It is now on Facebook’s platform, in its so-called “walled garden”, that you talk to your friends. “Our laptops have become just screens. They cannot do anything useful without the cloud,” says Muneeb Ali, co-founder of Blockstack, a platform for building decentralised apps. The DWeb is about re-decentralising things – so we aren’t reliant on these intermediaries to connect us. Instead users keep control of their data and connect and interact and exchange messages directly with others in their network.

Why do we need an alternative? 
With the current web, all that user data concentrated in the hands of a few creates risk that our data will be hacked. It also makes it easier for governments to conduct surveillance and impose censorship. And if any of these centralised entities shuts down, your data and connections are lost. Then there are privacy concerns stemming from the business models of many of the companies, which use the private information we provide freely to target us with ads. “The services are kind of creepy in how much they know about you,” says Brewster Kahle, the founder of the Internet Archive. The DWeb, say proponents, is about giving people a choice: the same services, but decentralised and not creepy. It promises control and privacy, and things can’t all of a sudden disappear because someone decides they should. On the DWeb, it would be harder for the Chinese government to block a site it didn’t like, because the information can come from other places.

How does the DWeb work that is different? 

There are two big differences in how the DWeb works compared to the world wide web, explains Matt Zumwalt, the programme manager at Protocol Labs, which builds systems and tools for the DWeb. First, there is this peer-to-peer connectivity, where your computer not only requests services but provides them. Second, how information is stored and retrieved is different. Currently we use http and https links to identify information on the web. Those links point to content by its location, telling our computers to find and retrieve things from those locations using the http protocol. By contrast, DWeb protocols use links that identify information based on its content – what it is rather than where it is. This content-addressed approach makes it possible for websites and files to be stored and passed around in many ways from computer to computer rather than always relying on a single server as the one conduit for exchanging information. “[In the traditional web] we are pointing to this location and pretending [the information] exists in only one place,” says Zumwalt. “And from this comes this whole monopolisation that has followed… because whoever controls the location controls access to the information.”…(More)”.
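The content-addressed approach Zumwalt describes can be illustrated with a toy sketch — plain SHA-256 hashing over in-memory “peers”, not the actual IPFS protocol. The identifier is derived from the data itself, so any peer holding the bytes can serve them, and the receiver can verify integrity by re-hashing what it fetched:

```python
import hashlib


def content_address(data: bytes) -> str:
    """Derive an identifier from the bytes themselves (location-independent)."""
    return hashlib.sha256(data).hexdigest()


# A toy "network": any number of peers can hold the same content.
peer_a: dict[str, bytes] = {}
peer_b: dict[str, bytes] = {}

doc = b"Hello, decentralised web!"
cid = content_address(doc)

# Both peers store the document under the same content-derived key,
# so there is no single canonical "location" for it.
peer_a[cid] = doc
peer_b[cid] = doc

# Retrieval works from whichever peer answers; the fetched bytes can be
# re-hashed to confirm they match the identifier that was requested.
fetched = peer_b.get(cid) or peer_a.get(cid)
assert content_address(fetched) == cid
```

Contrast this with an http URL, where the key is a server location: if that one server disappears or serves different bytes, the link silently breaks or lies, whereas a content-derived identifier lets any copy anywhere stand in for the original.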

The Known Known


Book Review by Sue Halpern in The New York Review of Books of The Known Citizen: A History of Privacy in Modern America by Sarah E. Igo; Habeas Data: Privacy vs. the Rise of Surveillance Tech by Cyrus Farivar;  Beyond Abortion: Roe v. Wade and the Battle for Privacy by Mary Ziegler; Privacy’s Blueprint: The Battle to Control the Design of New Technologies by Woodrow Hartzog: “In 1999, when Scott McNealy, the founder and CEO of Sun Microsystems, declared, “You have zero privacy…get over it,” most of us, still new to the World Wide Web, had no idea what he meant. Eleven years later, when Mark Zuckerberg said that “the social norms” of privacy had “evolved” because “people [had] really gotten comfortable not only sharing more information and different kinds, but more openly and with more people,” his words expressed what was becoming a common Silicon Valley trope: privacy was obsolete.

By then, Zuckerberg’s invention, Facebook, had 500 million users, was growing 4.5 percent a month, and had recently surpassed its rival, MySpace. Twitter had overcome skepticism that people would be interested in a zippy parade of 140-character posts; at the end of 2010 it had 54 million active users. (It now has 336 million.) YouTube was in its fifth year, the micro-blogging platform Tumblr was into its third, and Instagram had just been created. Social media, which encouraged and relied on people to share their thoughts, passions, interests, and images, making them the Web’s content providers, were ascendant.

Users found it empowering to bypass, and even supersede, the traditional gatekeepers of information and culture. The social Web appeared to bring to fruition the early promise of the Internet: that it would democratize the creation and dissemination of knowledge. If, in the process, individuals were uploading photos of drunken parties, and discussing their sexual fetishes, and pulling back the curtain on all sorts of previously hidden personal behaviors, wasn’t that liberating, too? How could anyone argue that privacy had been invaded or compromised or effaced when these revelations were voluntary?

The short answer is that they couldn’t. And they didn’t. Users, who in the early days of social media were predominantly young, were largely guileless and unconcerned about privacy. In a survey of sixty-four of her students at Rochester Institute of Technology in 2006, Susan Barnes found that they “wanted to keep information private, but did not seem to realize that Facebook is a public space.” When a random sample of young people was asked in 2007 by researchers from the Pew Research Center if “they had any concerns about publicly posted photos, most…said they were not worried about risks to their privacy.” (This was largely before Facebook and other tech companies began tracking and monetizing one’s every move on- and offline.)

In retrospect, the tendencies toward disclosure and prurience online should not have been surprising….(More)”.

Long Term Info-structure


Long Now Foundation Seminar by Juan Benet: “We live in a spectacular time,”…”We’re a century into our computing phase transition. The latest stages have created astonishing powers for individuals, groups, and our species as a whole. We are also faced with accumulating dangers — the capabilities to end the whole humanity experiment are growing and are ever more accessible. In light of the promethean fire that is computing, we must prevent bad outcomes and lock in good ones to build robust foundations for our knowledge, and a safe future. There is much we can do in the short-term to secure the long-term.”

“I come from the front lines of computing platform design to share a number of new super-powers at our disposal, some old challenges that are now soluble, and some new open problems. In this next decade, we’ll need to leverage peer-to-peer networks, crypto-economics, blockchains, Open Source, Open Services, decentralization, incentive-structure engineering, and so much more to ensure short-term safety and the long-term flourishing of humanity.”

Juan Benet is the inventor of the InterPlanetary File System (IPFS)—a new protocol which uses content-addressing to make the web faster, safer, and more open—and the creator of Filecoin, a cryptocurrency-incentivized storage market….(More + Video)”

Making a Smart City a Fairer City: Chicago’s Technologists Address Issues of Privacy, Ethics, and Equity, 2011-2018


Case study by Gabriel Kuris and Steven S. Strauss at Innovations for Successful Societies: “In 2011, voters in Chicago elected Rahm Emanuel, a 51-year-old former Chicago congressman, as their new mayor. Emanuel inherited a city on the upswing after years of decline but still marked by high rates of crime and poverty, racial segregation, and public distrust in government. The Emanuel administration hoped to harness the city’s trove of digital data to improve Chicagoans’ health, safety, and quality of life. During the next several years, Chief Data Officer Brett Goldstein and his successor Tom Schenk led innovative uses of city data, ranging from crisis management to the statistical targeting of restaurant inspections and pest extermination. As their teams took on more-sophisticated projects that predicted lead-poisoning risks and Escherichia coli outbreaks and created a citywide network of ambient sensors, the two faced new concerns about normative issues like privacy, ethics, and equity. By 2018, Chicago had won acclaim as a smarter city, but was it a fairer city? This case study discusses some of the approaches the city developed to address those challenges and manage the societal implications of cutting-edge technologies….(More)”.