Don’t Fight Regulation. Reprogram It


Article by Alison Kutler and Antonio Sweet: “Businesspeople too often assume that the relationship between government and the private sector is (and should be) adversarial. They imagine two opposing forces, each setting its own bounds of control. But if you can envision government and business as platforms that interact with one another, it becomes apparent why the word code applies to both technology and law. A successful business leader works with regulation the way a successful app developer works with another company’s operating system: testing it, finding innovative ways to get results within the system’s constraints, and offering guidance, where possible, to help make the system more efficient, fairer, and more valuable to end users.

Like the computer language of an operating system, legal and regulatory codes follow rules designed to make them widely recognizable to those who are appropriately trained. As legislators, regulators, and other officials write that code, they seek input from stakeholders through hearings and public-comment filings on proposed rules. Policymakers rely on constituents, public filings, and response analysis the way software designers use beta testers, crash reports, and developer feedback — to debug and validate code before deploying it across the entire system.

Unfortunately, policymakers and business leaders don’t always embrace what software developers know about collaborative innovation. Think about how much less a smartphone could do if its manufacturers never worked closely with people outside their engineering departments. When only a small subset of voices is involved, the final code reflects only the needs of the most vocal groups. As a result, the unengaged are stuck with a system that doesn’t take their needs into account or, worse, disables their product.

Policymakers may also benefit from emulating the kind of interoperability that makes software effective. When enterprise systems differ too much from one another, people struggle with unfamiliar interfaces and run into interoperability problems when they try to work across several systems at once. A product development team can pour massive resources into designing and building something that works perfectly in one operating system, only to have it slow down or freeze completely in another…(More)”.

The Structure of Bias


Paper by Gabbrielle M. Johnson: “What is a bias? Standard philosophical views of both implicit and explicit bias focus this question on the representations one harbors, e.g., stereotypes or implicit attitudes, rather than the ways in which those representations (or other mental states) are manipulated. I call such views representationalism.

In this paper, I argue that representationalism about bias is a mistake because it conceptualizes social bias in ways that do not fully capture the phenomenon. Crucially, such views fail to capture a heretofore neglected possibility of bias: one that influences an individual’s beliefs about and actions toward other people, but is, nevertheless, nowhere represented in that individual’s cognitive repertoire.

In place of representationalism, I develop a functional account of bias that treats it as a mental entity that takes propositional mental states as inputs and returns propositional mental states as outputs in a way that instantiates, or at the very least mimics, inferences on the basis of an individual’s social group membership. This functional characterization leaves open which mental states and processes bridge the gap between the inputs and outputs, ultimately highlighting the diversity of candidates that can serve this role….(More)”.
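To make the functional characterization concrete, here is a minimal illustrative sketch (not from the paper) in Python, treating a bias as any callable that maps propositional input states to propositional output states keyed to social-group membership; the names Proposition, make_group_bias, and the example mapping are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass(frozen=True)
class Proposition:
    """Toy stand-in for a propositional mental state, e.g. 'Sam is an engineer'."""
    subject: str
    predicate: str

# On the functional account, a bias is whatever maps input propositions to output
# propositions in a way that instantiates (or mimics) inference from social-group
# membership -- regardless of which mental states or processes do the work in between.
Bias = Callable[[Proposition], Proposition]

def make_group_bias(stereotype_map: Dict[str, str]) -> Bias:
    """Return a function that 'infers' a further predicate from group membership."""
    def bias(belief: Proposition) -> Proposition:
        inferred = stereotype_map.get(belief.predicate)
        # The output belief need not be represented anywhere in the agent's repertoire;
        # only the input-output pattern defines the bias.
        return Proposition(belief.subject, inferred) if inferred else belief
    return bias

# Purely illustrative mapping, not an empirical claim.
bias = make_group_bias({"is an engineer": "is good at math"})
print(bias(Proposition("Sam", "is an engineer")))
# Proposition(subject='Sam', predicate='is good at math')
```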

Information to Action: Strengthening EPA Citizen Science Partnerships for Environmental Protection


Report by the National Advisory Council for Environmental Policy and Technology: “Citizen science is catalyzing collaboration; new data and information brought about by greater public participation in environmental research are helping to drive a new era of environmental protection. As the body of citizen-generated data and information in the public realm continues to grow, EPA must develop a clear strategy to lead change and encourage action beyond the collection of data. EPA should recognize the variety of opportunities that it has to act as a conduit between the public and key partners, including state, territorial, tribal and local governments; nongovernmental organizations; and leading technology groups in the private sector. The Agency should build collaborations with new partners, identify opportunities to integrate equity into all relationships, and ensure that grassroots and community-based organizations are well supported and fairly resourced in funding strategies.

Key recommendations under this theme:

  • Recommendation 1. Catalyze action from citizen science data and information by providing guidance and leveraging collaboration.
  • Recommendation 2. Build inclusive and equitable partnerships by understanding partners’ diverse concerns and needs, including prioritizing better support for grassroots and community-based partnerships in EPA grant-funding strategies.

Increase state, territorial, tribal and local government engagement with citizen science

The Agency should reach out to tribes, states, territories and local governments throughout the country to understand the best practices and strategies for encouraging and incorporating citizen science in environmental protection. For states and territories looking for ways to engage in citizen science, EPA can help design strategies that recognize community perspectives while building capacity in state and territorial governments. Recognizing the direct connection between EPA and tribes, the Agency should seek tribal input and support tribes in using citizen science for environmental priorities. EPA should help to increase awareness of citizen science and, where jurisdictional efforts already exist, assist in making citizen science accessible through local government agencies. EPA should listen more proactively to the voices of local stakeholders and encourage partners to embrace a vision for citizen science that accelerates the achievement of environmental goals. As part of this approach, EPA should find ways to define and communicate the Agency’s role as a resource in helping communities achieve environmental outcomes.

Key recommendations under this theme:

  • Recommendation 3. Provide EPA support and engage states and territories to better integrate citizen science into program goals.
  • Recommendation 4. Build on the unique strengths of EPA-tribal relationships.
  • Recommendation 5. Align EPA citizen science work to the priorities of local governments.

Leverage external organizations for expertise and project level support

Collaborations between communities and other external organizations—including educational institutions, civic organizations, and community-based organizations—are accelerating the growth of citizen science. Because EPA’s direct connection with members of the public often is limited, the Agency could benefit significantly by consulting with key external organizations to leverage citizen science efforts to provide the greatest benefit for the protection of human health and the environment. EPA should look to external organizations as vital connections to communities engaged in collaboratively led scientific investigation to address community-defined questions, referred to as community citizen science. External organizations can help EPA in assessing gaps in community-driven research and help the Agency to design effective support tools and best management practices for facilitating effective environmental citizen science programs….(More)”.

4 reasons why Data Collaboratives are key to addressing migration


Stefaan Verhulst and Andrew Young at the Migration Data Portal: “If every era poses its dilemmas, then our current decade will surely be defined by questions over the challenges and opportunities of a surge in migration. The issues in addressing migration safely, humanely, and for the benefit of communities of origin and destination are varied and complex, and today’s public policy practices and tools are not adequate. Increasingly, it is clear that we need not only new solutions but also new, more agile methods for arriving at them.

Data are central to meeting these challenges and to enabling public policy innovation in a variety of ways. Yet, for all of data’s potential to address public challenges, the truth remains that most data generated today are in fact collected by the private sector. These data contain tremendous possible insights and avenues for innovation in how we solve public problems. But because of access restrictions, privacy concerns, and often limited data science capacity, their vast potential often goes untapped.

Data Collaboratives offer a way around this limitation.

Data Collaboratives: A new form of Public-Private Partnership for a Data Age

Data Collaboratives are an emerging form of partnership, typically between the private and public sectors, but often also involving civil society groups and the education sector. Now in use across various countries and sectors, from health to agriculture to economic development, they allow for the opening and sharing of information held in the private sector, in the process freeing up data silos to serve public ends.

Although the approach is still fledgling, we have begun to see instances of Data Collaboratives implemented to solve specific challenges within the broad and complex refugee and migrant space. As the examples we describe below suggest (and which we examine in more detail in the Stanford Social Innovation Review), the use of such Collaboratives is geographically dispersed and diffuse; there is an urgent need to pull together a cohesive body of knowledge to more systematically analyze what works and what doesn’t.

This is something we have started to do at the GovLab. We have analyzed a wide variety of Data Collaborative efforts, across geographies and sectors, with a goal of understanding when and how they are most effective.

The benefits of Data Collaboratives in the migration field

As part of our research, we have identified four main value propositions for the use of Data Collaboratives in addressing different elements of the multi-faceted migration issue. …(More)”.

Most Maps of the New Ebola Outbreak Are Wrong


Ed Yong in The Atlantic: “Almost all the maps of the outbreak zone that have thus far been released contain mistakes of this kind. Different health organizations all seem to use their own maps, most of which contain significant discrepancies. Things are roughly in the right place, but their exact positions can be off by miles, as can the boundaries between different regions.

Sinai, a cartographer at UCLA, has been working with the Ministry of Health to improve the accuracy of the Congo’s maps, and flew over on Saturday at their request. For each health zone within the outbreak region, Sinai compiled a list of the constituent villages, plotted them using the most up-to-date sources of geographical data, and drew boundaries that include these places and no others. The maps at the top of this piece show the before (left) and after (right) images….

Consider Bikoro, the health zone where the outbreak may have originated, and where most cases are found. Applying that process to Bikoro, by plotting all of its villages and drawing a boundary that includes those places and no others, produces a shape roughly similar to the one on current maps, but with critical differences. Notably, existing maps place the village of Ikoko Impenge—one of the epicenters of the outbreak—outside the Bikoro health zone, when it actually lies within the zone.
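As a rough sketch of the workflow described here (plotting a zone’s villages and drawing a boundary that contains those places and no others), the following Python snippet uses Shapely to build a hull around a set of village coordinates and test whether a given point falls inside it. The coordinates are invented for illustration, not real DRC geodata, and a real health-zone boundary would be more detailed than a convex hull.

```python
# Illustrative only: build a boundary around village points and test membership.
# Coordinates are made up; real work would use verified geolocations.
from shapely.geometry import MultiPoint, Point

bikoro_villages = {                 # hypothetical (longitude, latitude) pairs
    "Village A": (18.12, -0.39),
    "Village B": (18.35, -0.52),
    "Village C": (18.07, -0.71),
    "Village D": (18.41, -0.66),
}

# Smallest convex boundary that includes these places and no others.
zone_boundary = MultiPoint(list(bikoro_villages.values())).convex_hull

# Does a village of interest fall inside the redrawn zone?
candidate = Point(18.25, -0.55)     # made-up location
print(zone_boundary.contains(candidate))  # True if it lies within the boundary
```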

 “These visualizations are important for communicating the reality on the ground to all levels of the health hierarchy, and to international partners who don’t know the country,” says Mathias Mossoko, the head of disease surveillance data in DRC.

“It’s really important for the outbreak response to have real and accurate data,” adds Bernice Selo, who leads the cartographic work from the Ministry of Health’s command center in Kinshasa. “You need to know exactly where the villages are, where the health facilities are, where the transport routes and waterways are. All of this helps you understand where the outbreak is, where it’s moving, how it’s moving. You can see which villages have the highest risk.”

To be clear, there’s no evidence that these problems are hampering the response to the current outbreak. It’s not like doctors are showing up in the middle of the forest, wondering why they’re in the wrong place. “Everyone on the ground knows where the health zones start and end,” says Sinai. “I don’t think this will make or break the response. But you surely want the most accurate data.”

It feels unusual not to have this information readily at hand, especially in an era when digital maps are so omnipresent and so supposedly truthful. If you search for San Francisco on Google Maps, you can be pretty sure that what comes up is actually where San Francisco is. On Google Street View, you can even walk along a beach at the other end of the world….

But the Congo is a massive country—a quarter the size of the United States, with considerably fewer resources. Until very recently, it hasn’t had the means to get accurate geolocalized data. Instead, the boundaries of the health zones and their constituent “health areas,” as well as the positions of specific villages, towns, rivers, hospitals, clinics, and other landmarks, are often based on local knowledge and hand-drawn maps. Here’s an example, which I saw when I visited the National Institute for Biomedical Research in March. It does the job, but it’s clearly not to scale….(More)”.

Citizen-generated evidence for a more sustainable and healthy food system


Research Report by Bill Vorley: “Evidence generation by and with low-income citizens is particularly important if policymakers are to improve understanding of people’s diets and the food systems they use, in particular the informal economy. The informal food economy is the main route for low-income communities to secure their food, and is an important source of employment, especially for women and youth. The very nature of informality means that the realities of poor people’s lives are often invisible to policymakers. This invisibility is a major factor in exclusion and results in frequent mismatches between policy and local realities. This paper focuses on citizen-generated evidence as a means for defending and improving the food system of the poor. It clearly outlines a range of approaches to citizen-generated evidence including primary data collection and citizen access to and use of existing information….(More)”.

Blockchain as a force for good: How this technology could transform the sharing economy


Aaron Fernando at Shareable: “The volatility in the price of cryptocurrencies doesn’t matter to restaurateur Helena Fabiankovic, who started Baba’s Pierogies in Brooklyn with her partner Robert in 2015. Yet she and her business are already positioned to reap the real-world benefits of the technology that underpins these digital currencies — the blockchain — and they will be at the forefront of a sustainable, community-based peer-to-peer energy revolution because of it.

So what does a restaurateur have to do with the blockchain and local energy? Fabiankovic is one of the early participants in the Brooklyn Microgrid, a project of the startup LO3 Energy that uses a combination of innovative technologies — blockchain and smart meters — to operate a virtual microgrid in the borough of Brooklyn in New York City, New York. This microgrid enables residents to buy and sell green energy directly with their neighbors at much better rates than if they interacted only with centralized utility providers.

Just as we don’t pay much attention to the critical infrastructure that powers our digital world and exists just out of sight — from the Automated Clearing House (ACH), which undergirds our financial system, to the undersea cables that enable the Internet to be globally useful — blockchain is likely to change our lives in ways that will eventually be invisible. In the sharing economy, we have traditionally just used existing infrastructure and built platforms and services on top of it. Considering that those undersea cables are owned by private companies with their own motives and that the locations of ACH data centers are heavily classified, there is a lot to be desired in terms of transparency, resilience, and independence from self-interested third parties. That’s where the open-source, decentralized infrastructure of the blockchain offers much promise and potential for the sharing economy.

In the case of Brooklyn Microgrid, which is part of an emerging model for shared energy use via the blockchain, this decentralized infrastructure would allow residents like Fabiankovic to save money and make sustainable choices. Shared ownership and community financing for green infrastructure like solar panels is part of the model. “Everyone can pay a different amount and you can get a proportional amount of energy that’s put off by the panel, based on how much that you own,” says Scott Kessler, director of business development at LO3. “It’s really just a way of crowdfunding an asset.”
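Kessler’s “crowdfunding an asset” framing comes down to simple pro-rata arithmetic. The sketch below is an illustration only (the figures and the function name are assumptions, not LO3’s actual accounting), showing how a panel’s output could be allocated in proportion to what each participant paid.

```python
# Illustrative pro-rata split of a shared solar panel's output.
# Contribution amounts and the kWh figure are invented for the example.
def allocate_output(contributions: dict, kwh_produced: float) -> dict:
    """Allocate kwh_produced to each contributor in proportion to what they paid."""
    total_paid = sum(contributions.values())
    return {owner: kwh_produced * paid / total_paid
            for owner, paid in contributions.items()}

contributions = {"Baba's Pierogies": 600.0, "Neighbor 1": 300.0, "Neighbor 2": 100.0}
print(allocate_output(contributions, kwh_produced=50.0))
# {"Baba's Pierogies": 30.0, 'Neighbor 1': 15.0, 'Neighbor 2': 5.0}
```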

The type of blockchain used by the Brooklyn Microgrid makes it possible to collect and communicate data from smart meters every second, so that the price of electricity can be updated in real time, while users still transact with each other in U.S. dollars. The core idea of the Brooklyn Microgrid is to utilize a tailored blockchain to align energy consumption with energy production, and to do this with rapidly updated price information that then changes behavior around energy….(More)”.
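What that alignment might look like mechanically is easy to picture with a toy price rule (a guess at the general shape, not LO3’s actual market design): each interval, compare local production with consumption and nudge the price up when demand outstrips neighborhood supply, down when there is a surplus.

```python
# Toy per-interval price update: not LO3's algorithm, just an illustration of how
# frequent meter readings could be used to align consumption with local production.
def update_price(price, produced_kwh, consumed_kwh,
                 sensitivity=0.05, floor=0.05, cap=0.30):
    if produced_kwh == 0:
        return cap                      # no local supply this interval
    imbalance = (consumed_kwh - produced_kwh) / produced_kwh
    new_price = price * (1 + sensitivity * imbalance)
    return min(max(new_price, floor), cap)   # keep $/kWh within plausible bounds

price = 0.13
for produced, consumed in [(4.0, 5.0), (4.0, 3.0), (0.0, 2.0)]:  # per-interval readings
    price = update_price(price, produced, consumed)
    print(round(price, 4))   # 0.1316, 0.13, 0.3
```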

Open Data Charter Measurement Guide


Guide by Ana Brandusescu and Danny Lämmerhirt: “We are pleased to announce the launch of our Open Data Charter Measurement Guide. The guide is a collaborative effort of the Charter’s Measurement and Accountability Working Group (MAWG). It analyses the Open Data Charter principles and how they are assessed based on current open government data measurement tools. Governments, civil society, journalists, and researchers may use it to better understand how they can measure open data activities according to the Charter principles.

What can I find in the Measurement Guide?

  • An executive summary for people who want to quickly understand what measurement tools exist and for what principles.
  • An analysis of how each Charter principle is measured, including a comparison of indicators that are currently used to measure each Charter principle and its commitments. This analysis is based on the open data indicators used by the five largest measurement tools — the Web Foundation’s Open Data Barometer, Open Knowledge International’s Global Open Data Index, Open Data Watch’s Open Data Inventory, OECD’s OURdata Index, and the European Open Data Maturity Assessment. For each principle, we also highlight case studies of how Charter adopters have practically implemented the commitments of that principle.
  • Comprehensive indicator tables show how each Charter principle commitment can be measured. These tables are especially helpful for comparing how different indices approach the same commitment, and where gaps exist. Here, you can see an example of the indicator tables for Principle 1.
  • A methodology section that details how the Working Group conducted the analysis of mapping existing measurement indices against Charter commitments.
  • A recommended list of resources for anyone who wants to read more about measurement and policy.

The Measurement Guide is available online in the form of a Gitbook and in a printable PDF version.

This is your office on AI


Article by Jeffrey Brown in a Special Issue of the Wilson Quarterly on AI: “The future has arrived, and it’s your first day at your new job. You step across the threshold sporting a nervous smile and harboring visions of virtual handshakes and brain-computer interfaces. After all, this is one of those newfangled, modern offices that science-fiction writers have been dreaming up for ages. Then you bump up against something with a thud. No, it’s not one of the ubiquitous glass walls, but the harsh reality of an office that, at first glance, doesn’t appear much different from what you’re accustomed to. Your new colleagues shuffle between meetings clutching phones and laptops. A kitchenette stocked with stale donuts lurks in the background. And, by the way, you were fifteen minutes late because the commute is still hell.

So where is the fabled “office of the future”? After all, many of us have only ever fantasized about the ways in which technology – and especially artificial intelligence – might transform our working lives for the better. In fact, the AI-enabled office will usher in far more than next-generation desk supplies. It’s only over subsequent weeks that you come to appreciate how the office of the future feels, operates, and yes, senses. It also slowly dawns on you that work itself has changed and that what it means to be a worker has undergone a similar retrofit.

With AI already deployed in everything from the fight against ISIS to the hunt for exoplanets and your cat’s Alexa-enabled Friskies order, its application to the office should come as no surprise. As workers pretty much everywhere can attest, today’s office has issues: It can’t intuitively crack a window when your officemate decides to microwave leftover catfish. It seems to willfully disregard your noise, temperature, light, and workflow preferences. And it certainly doesn’t tell its designers – or your manager – what you are really thinking as you plop down in your annoyingly stiff chair to sip your morning cup of mud.

Now, you may be thinking to yourself, “These seem like trivial issues that can be worked out simply by chatting with another human being, so why do we even need AI in my office?” If so, read on. In your lifetime, companies and workers will channel AI to unlock new value – and immense competitive advantage….(More)”.

Mapping the economy in real time is almost ‘within our grasp’


Delphine Strauss at the Financial Times: “The goal of mapping economic activity in real time, just as we do for weather or traffic, is “closer than ever to being within our grasp”, according to Andy Haldane, the Bank of England’s chief economist. In recent years, “data has become the new oil . . . and data companies have become the new oil giants”, Mr Haldane told an audience at King’s Business School …

But economics and finance have been “rather reticent about fully embracing this oil-rush”, partly because economists have tended to prefer a deductive approach that puts theory ahead of measurement. This needs to change, he said, because relying too much on either theory or real-world data in isolation can lead to serious mistakes in policymaking — as was seen when the global financial crisis exposed the “empirical fragility” of macroeconomic models.

Parts of the private sector and academia have been far swifter to exploit the vast troves of ever-accumulating data now available — 90 per cent of which has been created in the last two years alone. Massachusetts Institute of Technology’s “Billion Prices Project”, name-checked in Mr Haldane’s speech, now collects enough data from online retailers for its commercial arm to provide daily inflation updates for 22 economies….

The UK’s Office for National Statistics — which has faced heavy criticism over the quality of its data in recent years — is experimenting with “web-scraping” to collect price quotes for food and groceries, for example, and making use of VAT data from small businesses to improve its output-based estimates of gross domestic product. In both cases, the increased sample size and granularity could bring considerable benefits on top of existing surveys, Mr Haldane said.
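To see how scraped quotes become an inflation reading, here is a hedged sketch of one standard elementary aggregate, a Jevons-style index (the geometric mean of price relatives), applied to invented daily prices; neither the Billion Prices Project nor the ONS is claimed to use exactly this code.

```python
# Sketch: turn two days of scraped price quotes into a Jevons-style elementary index
# (geometric mean of price relatives). Products and prices are invented.
from math import prod

def jevons_index(base_prices: dict, new_prices: dict) -> float:
    items = base_prices.keys() & new_prices.keys()        # items quoted on both days
    relatives = [new_prices[i] / base_prices[i] for i in items]
    return prod(relatives) ** (1 / len(relatives)) * 100  # base day = 100

monday  = {"bread": 1.10, "milk": 0.95, "eggs": 2.40, "rice": 1.80}
tuesday = {"bread": 1.12, "milk": 0.95, "eggs": 2.52, "rice": 1.76}
print(round(jevons_index(monday, tuesday), 2))  # roughly 101.1: prices up ~1.1% on the day
```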

The BoE itself is trying to make better use of financial data — for example, by using administrative data on owner-occupied mortgages to better understand pricing decisions in the UK housing market. Mr Haldane sees scope to go further with the new data coming on stream on payment, credit and banking flows. …New data sources and techniques could also help policymakers think about human decision-making — which rarely conforms with the rational process assumed in many economic models. Data on music downloads from Spotify, used as an indicator of sentiment, has recently been shown to do at least as well as a standard consumer confidence survey in tracking consumer spending….(More)”.
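The claim that a listening-based indicator “does at least as well” as a survey is, in practice, a comparison of how strongly each series correlates with spending. A minimal sketch of that check, with wholly invented toy numbers just to show the computation, might look like this.

```python
# Toy comparison of two sentiment indicators against consumer spending growth.
# All numbers are invented solely to show the computation.
from statistics import correlation  # available in Python 3.10+

spending_growth   = [0.4, 0.6, 0.2, -0.1, 0.5, 0.7]   # e.g. monthly % change
survey_confidence = [1.0, 1.2, 0.8,  0.6, 1.1, 1.3]   # standard survey index
music_sentiment   = [0.9, 1.3, 0.7,  0.5, 1.2, 1.4]   # index derived from listening data

print(round(correlation(spending_growth, survey_confidence), 3))
print(round(correlation(spending_growth, music_sentiment), 3))
# If the second value is at least as high as the first, the listening-based indicator
# tracks spending at least as well as the survey (on this toy series).
```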