Identity in the Decentralized Web


Blog by Jim Nelson: “The idea is that web sites will verify you much as a bartender checks your ID before pouring a drink.  The bar doesn’t store a copy of your card and the bartender doesn’t look at your name or address; only your age is pertinent to receive service.  The next time you enter the bar the bartender once again asks for proof of age, which you may or may not relinquish. That’s the promise of self-sovereign identity.

At the Decentralized Web Summit, questions and solutions were bounced around in the hopes of solving this fundamental problem.  Developers spearheading the next web hashed out the criteria for decentralized identity, including:

  • secure: to prevent fraud, maintain privacy, and ensure trust between all parties
  • self-sovereign: individual ownership of private information
  • consent: fine-tuned control over what information third-parties are privy to
  • directed identity: manage multiple identities for different contexts (for example, your doctor can access certain aspects while your insurance company accesses others)
  • and, of course, decentralized: no central authority or governing body holds private keys or generates identifiers

One difficulty with decentralized identity is that these criteria often compete, pulling in opposite directions.

[Image courtesy of Jolocom]

For example, while security seems like a no-brainer, self-sovereign identity puts the end-user in control (not Facebook, Google, or Twitter).  It’s incumbent on users to secure their own information, which raises questions of key management, data storage practices, and so on. Facebook, Google, and Twitter pay full-time engineers to do this job; handing that responsibility to end-users shifts the burden to people who may not be so technically savvy.  The inconvenience of key management also creates further hurdles to widespread adoption of the decentralized web.

The good news is, there are many working proposals today attempting to solve the above problems.  One of the more promising is DID (Decentralized Identifier).

A DID is simply a URI, a familiar piece of text to most people nowadays.  Each DID references a record stored in a blockchain. DIDs are not tied to any particular blockchain, and so they’re interoperable with existing and future technologies.  DIDs are cryptographically secure as well.

DIDs require no central authority to produce or validate.  If you want a DID, you can generate one yourself, or as many as you want.  In fact, you should generate lots of them.  Each unique DID gives the user fine-grained control over what personal information is revealed when interacting with a myriad of services and people.
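As a rough sketch of what this looks like in practice, the snippet below parses the generic `did:<method>:<id>` shape and mints throwaway identifiers locally. The `did:example` method is the placeholder method name used in the W3C draft; the validation pattern here is a simplification for illustration, not the full spec grammar:

```python
import re
import secrets

# A DID is a URI of the form "did:<method>:<method-specific-id>".
# Real methods (e.g. did:web) resolve against specific systems;
# this character class is a simplified stand-in for the spec's ABNF.
DID_PATTERN = re.compile(r"^did:([a-z0-9]+):([A-Za-z0-9.\-_:]+)$")

def parse_did(did: str):
    """Split a DID into its method and method-specific identifier."""
    match = DID_PATTERN.match(did)
    if match is None:
        raise ValueError(f"not a valid DID: {did!r}")
    return match.groups()

def new_local_did(method: str = "example") -> str:
    # Anyone can mint identifiers locally; no registry or central
    # authority is consulted -- the point of the "decentralized" criterion.
    return f"did:{method}:{secrets.token_hex(16)}"

method, ident = parse_did("did:example:123456789abcdefghi")
print(method, ident)  # → example 123456789abcdefghi
print(new_local_did())
```

Because generation is just local randomness, creating one DID per service or relationship is cheap, which is what makes the fine-grained disclosure described above practical.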

If you’re interested in learning more, I recommend reading Michiel Mulders’ article on DIDs, “the Internet’s ‘missing identity layer’.”  The DID technical specification is being developed by the W3C.  And for those looking for code and community, check out the Decentralized Identity Foundation…(More)”.

Age of the expert as policymaker is coming to an end


Wolfgang Münchau at the Financial Times: “…Where the conflation of the expert and the policymaker did real damage was not to policy but to expertdom itself. It compromised the experts’ most prized asset — their independence.

When economics blogging started to become fashionable, I sat on a podium with an academic blogger who predicted that people like him would usurp the role of the economics newspaper columnist within a period of 10 years. That was a decade ago. His argument was that trained economists were just smarter. What he did not reckon with is that it is hard to speak truth to power when you have to beg that power to fund your think-tank or institute. Even less so once you are politically attached or appointed. Independence matters.

A good example of a current issue where lack of independence gets in the way is the debate on cryptocurrencies. I agree that governments should not lightly concede the money monopoly of the state, which is at the heart of our economic system. But I sometimes wonder whether those who hyperventilate about crypto do so because they find the whole concept offensive. Cryptocurrencies embody a denial of economics. There are no monetary policy committees. Cryptocurrencies may, or may not, damage the economy. But they surely damage the economist.

Even the best arguments lose power when they get mixed up with personal interests. If you want to be treated as an independent authority, do not join a policy committee, or become a minister or central banker. As soon as you do, you have changed camps. You may think of yourself as an expert. The rest of the world does not. The minimum needed to maintain or regain credibility is to state conflicts of interests openly. The only option in such cases is to be transparent. This is also why financial journalists have to declare the shares they own. The experts I listen to are those who are independent, and who do not follow a political agenda. The ones I avoid are the zealots and those who wander off their reservation and make pronouncements without inhibition. An economist may have strong views on the benefits of vaccination, for example, but is still no expert on the subject. And I often cringe when I hear a doctor trying to prove a point by using statistics. The world will continue to need policymakers and the experts who advise them. But more than that, it needs them to be independent….(More)”.

Soon, satellites will be able to watch you everywhere all the time


Christopher Beam at MIT Technology Review: “In 2013, police in Grants Pass, Oregon, got a tip that a man named Curtis W. Croft had been illegally growing marijuana in his backyard. So they checked Google Earth. Indeed, the four-month-old satellite image showed neat rows of plants growing on Croft’s property. The cops raided his place and seized 94 plants.

In 2018, Brazilian police in the state of Amapá used real-time satellite imagery to detect a spot where trees had been ripped out of the ground. When they showed up, they discovered that the site was being used to illegally produce charcoal, and arrested eight people in connection with the scheme.

Chinese government officials have denied or downplayed the existence of Uighur reeducation camps in Xinjiang province, portraying them as “vocational schools.” But human rights activists have used satellite imagery to show that many of the “schools” are surrounded by watchtowers and razor wire.

Every year, commercially available satellite images are becoming sharper and taken more frequently. In 2008, there were 150 Earth observation satellites in orbit; by now there are 768. Satellite companies don’t offer 24-hour real-time surveillance, but if the hype is to be believed, they’re getting close. Privacy advocates warn that innovation in satellite imagery is outpacing the US government’s (to say nothing of the rest of the world’s) ability to regulate the technology. Unless we impose stricter limits now, they say, one day everyone from ad companies to suspicious spouses to terrorist organizations will have access to tools previously reserved for government spy agencies. Which would mean that at any given moment, anyone could be watching anyone else.

The images keep getting clearer

Commercial satellite imagery is currently in a sweet spot: powerful enough to see a car, but not enough to tell the make and model; collected frequently enough for a farmer to keep tabs on crops’ health, but not so often that people could track the comings and goings of a neighbor. This anonymity is deliberate. US federal regulations limit images taken by commercial satellites to a resolution of 25 centimeters, or about the length of a man’s shoe….(More)”.

Google and the University of Chicago Are Sued Over Data Sharing


Daisuke Wakabayashi in The New York Times: “When the University of Chicago Medical Center announced a partnership to share patient data with Google in 2017, the alliance was promoted as a way to unlock information trapped in electronic health records and improve predictive analysis in medicine.

On Wednesday, the University of Chicago, the medical center and Google were sued in a potential class-action lawsuit accusing the hospital of sharing hundreds of thousands of patients’ records with the technology giant without stripping identifiable date stamps or doctors’ notes.

The suit, filed in United States District Court for the Northern District of Illinois, demonstrates the difficulties technology companies face in handling health data as they forge ahead into one of the most promising — and potentially lucrative — areas of artificial intelligence: diagnosing medical problems.

Google is at the forefront of an effort to build technology that can read electronic health records and help physicians identify medical conditions. But the effort requires machines to learn this skill by analyzing a vast array of old health records collected by hospitals and other medical institutions.

That raises privacy concerns, especially when the data is used by a company like Google, which already knows what you search for, where you are and what interests you hold.

In 2016, DeepMind, a London-based A.I. lab owned by Google’s parent company, Alphabet, was accused of violating patient privacy after it struck a deal with Britain’s National Health Service to process medical data for research….(More)”.

How Ideas and Institutions Shape the Politics of Public Policy


Book by Daniel Béland: ”…provides a critical review of existing literature on the role of ideas and institutions in the politics of public policy, with the aim of contributing to the study of the politics of public policy. Because most policy scholars deal with the role of ideas or institutions in their research, such a critical review should help them improve their knowledge of crucial analytical issues in policy and political analysis. The following discussion brings together insights from both the policy studies literature and new institutionalism in sociology and political science, and stresses the explanatory role of ideas and institutions….(More)”.

Open Mobility Foundation


Press Release: “The Open Mobility Foundation (OMF) – a global coalition led by cities committed to using well-designed, open-source technology to evolve how cities manage transportation in the modern era – launched today with the mission to promote safety, equity and quality of life. The announcement comes as a response to the growing number of vehicles and emerging mobility options on city streets. A new city-governed non-profit, the OMF brings together academic, commercial, advocacy and municipal stakeholders to help cities develop and deploy new digital mobility tools, and provide the governance needed to efficiently manage them.

“Cities are always working to harness the power of technology for the public good. The Open Mobility Foundation will help us manage emerging transportation infrastructures, and make mobility more accessible and affordable for people in all of our communities,” said Los Angeles Mayor Eric Garcetti, who also serves as Advisory Council Chair of Accelerator for America, which showcased the MDS platform early on.

The OMF convenes a new kind of public-private forum to seed innovative ideas and govern an evolving software platform. Serving as a forum for discussions about pedestrian safety, privacy, equity, open-source governance and other related topics, the OMF has engaged a broad range of city and municipal organizations, private companies and non-profit groups, and experts and advocates to ensure comprehensive engagement and expertise on vital issues….

The OMF governs a platform called the “Mobility Data Specification” (MDS), which the Los Angeles Department of Transportation developed to help manage dockless micro-mobility programs (including shared dockless e-scooters). MDS consists of a set of Application Programming Interfaces (APIs) that create standard communications between cities and private companies to improve their operations. The APIs allow cities to collect data that can inform real-time traffic management and public policy decisions to enhance safety, equity and quality of life. More than 50 cities across the United States – and dozens across the globe – already use MDS to manage micro-mobility services.
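To make that concrete, here is a minimal sketch of the kind of aggregate a city might compute from an MDS-style event feed. The records, field names, and provider names below are simplified illustrations in the spirit of the MDS provider feeds, not an exact copy of the specification:

```python
from collections import Counter

# Hypothetical event records, loosely modeled on an MDS-style
# status-change feed; providers "ScooterCo" and "RideShareX" are invented.
events = [
    {"provider_name": "ScooterCo",  "vehicle_id": "a1", "event_type": "trip_start"},
    {"provider_name": "ScooterCo",  "vehicle_id": "a1", "event_type": "trip_end"},
    {"provider_name": "RideShareX", "vehicle_id": "b2", "event_type": "trip_start"},
]

def trips_started_per_provider(events):
    """Count trip-start events by provider -- the sort of simple
    aggregate that could feed a city's traffic or equity dashboards."""
    return Counter(
        e["provider_name"] for e in events if e["event_type"] == "trip_start"
    )

print(trips_started_per_provider(events))
```

Because every provider reports through the same standardized interface, a city can run one analysis across all operators instead of reconciling a different data format per company.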

Making this software open and free offers a safe and efficient environment for stakeholders, including municipalities, companies, experts and the public, to solve problems together. And because private companies scale best when cities can offer a consistent playbook for innovation, the OMF aims to nurture those services that provide the highest benefit to the largest number of people, from sustainability to safety outcomes….(More)”

Seize the Data: Using Evidence to Transform How Federal Agencies Do Business


Report by the Partnership for Public Service: “The use of data analysis, rigorous evaluation and a range of other credible strategies to inform decision-making is becoming more common across government. Even so, the movement is nascent, with leading practices implemented at some agencies, but not yet widely adopted. Much more progress is necessary. In fact, the recently enacted Foundations for Evidence-Based Policymaking Act, as well as the recently released draft Federal Data Strategy Action Plan, both prioritize broader adoption of leading practices.

To support that effort, this report highlights practical steps that agencies can take to become more data-driven and evidence-based. The findings emerged from a series of workshops and interviews conducted between April 2018 and May 2019 by the Partnership for Public Service and Grant Thornton. From these sessions, we learned that the most forward-thinking agencies rely on multiple approaches, including:
  • Using top-down and bottom-up approaches to build evidence-based organizations.
  • Driving longer-term and shorter-term learning.
  • Using existing data and new data.
  • Strengthening internal capacity and creating external research practitioner partnerships.
This report describes what these strategies look like in practice, and shares real-world and replicable examples of how leading agencies have become more data-driven and evidence-based….(More)”.

Challenges in using data across government


National Audit Office (UK): “Data is crucial to the way government delivers services for citizens, improves its own systems and processes, and makes decisions. Our work has repeatedly highlighted the importance of evidence-based decision-making at all levels of government activity, and the problems that arise when data is inadequate.

Government recognises the value of using data more effectively, and the importance of ensuring security and public trust in how it is used. It plans to produce a new national data strategy in 2020 to position “the UK as a global leader on data, working collaboratively and openly across government”.

To achieve its ambitions government will need to resolve fundamental challenges around how to use and share data safely and appropriately, and how to balance competing demands on public resources in a way that allows for sustained but proportionate investment in data. The future national data strategy provides the government with an opportunity to do this, building on the renewed interest and focus on the use of data within government and beyond.

Content and scope of the report

This report sets out the National Audit Office’s experience of data across government, including initial efforts to start to address the issues. From our past work we have identified three areas where government needs to establish the pre-conditions for success: clear strategy and leadership; a coherent infrastructure for managing data; and broader enablers to safeguard and support the better use of data. In this report we consider:

  • the current data landscape across government (Part One);
  • how government needs a clear plan and leadership to improve its use of data (Part Two);
  • the quality, standards and systems needed to use data effectively (Part Three); and
  • wider conditions and enablers for success (Part Four).

Concluding remarks

Past examples such as Windrush and Carer’s Allowance show how important good-quality data is, and the consequences when it is not used well. Without accurate, timely and proportionate data, government will not be able to get the best use out of public money or take the next step towards more sophisticated approaches to using data that can reap real rewards.

But despite years of effort and many well-documented failures, government has lacked clear and sustained strategic leadership on data. This has led to departments under-prioritising their own efforts to manage and improve data. There are some early signs that the situation is improving, but unless government uses the data strategy to push a sea change in strategy and leadership, it will not get the right processes, systems and conditions in place to succeed, and this strategy will be yet another missed opportunity….(More)”.

Modernizing Congress: Bringing Democracy into the 21st Century


Report by Lorelei Kelly: “Congress represents a national cross section of civic voice. It is potentially the most diverse market for ideas in government and should be reaping the benefits of America’s creativity and knowledge. During our transition into the 21st century, this civic information asset — from lived experience to structured data — should fuel the digital infrastructure of a modern representative system. Yet Congress has thus far failed to tap this resource on behalf of its legislative and deliberative functions.

Today’s Congress can’t compete on digital infrastructure or modern data methods with the executive branch, the media or the private sector. To be sure, information weaponization, antique technology and Congress’ stubborn refusal to fund itself have arrested its development of a digital infrastructure. Congress is knowledge incapacitated, physically disconnected and technologically obsolete. In this condition, it cannot fulfill its First Branch duties as laid out in Article I of the U.S. Constitution.

Fortunately, changing the direction of Congress is now in sight. Before the end of January 2019, (1) the Foundations for Evidence-Based Policymaking Act became law, (2) the House created a Select Committee on Modernization, and (3) Congress began to restore its internal science and technology capacity.

Modernizing Congress lays out a plan to accelerate this institutional progress. It scopes out the challenge of including civic voice in the legislative and deliberative process. It then identifies trusted local information intermediaries who could act as key components of a modern knowledge commons in Congress. With three case studies, the report illustrates how members and staff are finding new ways to build connection and gather useful constituent input at the district level. The report explores an urban, rural and suburban district. It concludes that while individual members are leveraging technology to connect and use new forms of civic voice from constituents, what Congress needs most is a systemwide digital infrastructure and updated institutional standards for data collection….(More)”.

Information, Technology and Control in a Changing World: Understanding Power Structures in the 21st Century


Book edited by Blayne Haggart, Kathryn Henne, and Natasha Tusikov: “This book explores the interconnected ways in which the control of knowledge has become central to the exercise of political, economic, and social power. Building on the work of International Political Economy scholar Susan Strange, this multidisciplinary volume features experts from political science, anthropology, law, criminology, women’s and gender studies, and Science and Technology Studies, who consider how the control of knowledge is shaping our everyday lives. From “weaponised copyright” as a censorship tool, to the battle over control of the internet’s “guts,” to the effects of state surveillance at the Mexico–U.S. border, this book offers a coherent way to understand the nature of power in the twenty-first century…(More)”.